[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 32935 1726853714.33944: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-Qi7 executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 32935 1726853714.34393: Added group all to inventory 32935 1726853714.34395: Added group ungrouped to inventory 32935 1726853714.34400: Group all now contains ungrouped 32935 1726853714.34403: Examining possible inventory source: /tmp/network-iHm/inventory.yml 32935 1726853714.53527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 32935 1726853714.53590: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 32935 1726853714.53613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 32935 1726853714.53693: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 32935 1726853714.53764: Loaded config def from plugin (inventory/script) 32935 1726853714.53766: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 32935 1726853714.53808: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 32935 1726853714.53896: Loaded config def from plugin (inventory/yaml) 32935 1726853714.53898: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 32935 1726853714.53986: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 32935 1726853714.54495: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 32935 1726853714.54498: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 32935 1726853714.54501: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 32935 1726853714.54507: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 32935 1726853714.54512: Loading data from /tmp/network-iHm/inventory.yml 32935 1726853714.54580: /tmp/network-iHm/inventory.yml was not parsable by auto 32935 1726853714.54643: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 32935 1726853714.54683: Loading data from /tmp/network-iHm/inventory.yml 32935 1726853714.54762: group all already in inventory 32935 1726853714.54769: set inventory_file for managed_node1 32935 1726853714.54775: set inventory_dir for managed_node1 32935 1726853714.54776: Added host managed_node1 to inventory 32935 1726853714.54779: Added host managed_node1 to group all 32935 1726853714.54780: set ansible_host for managed_node1 32935 1726853714.54780: 
set ansible_ssh_extra_args for managed_node1 32935 1726853714.54784: set inventory_file for managed_node2 32935 1726853714.54786: set inventory_dir for managed_node2 32935 1726853714.54787: Added host managed_node2 to inventory 32935 1726853714.54789: Added host managed_node2 to group all 32935 1726853714.54789: set ansible_host for managed_node2 32935 1726853714.54790: set ansible_ssh_extra_args for managed_node2 32935 1726853714.54793: set inventory_file for managed_node3 32935 1726853714.54795: set inventory_dir for managed_node3 32935 1726853714.54796: Added host managed_node3 to inventory 32935 1726853714.54797: Added host managed_node3 to group all 32935 1726853714.54798: set ansible_host for managed_node3 32935 1726853714.54799: set ansible_ssh_extra_args for managed_node3 32935 1726853714.54801: Reconcile groups and hosts in inventory. 32935 1726853714.54805: Group ungrouped now contains managed_node1 32935 1726853714.54807: Group ungrouped now contains managed_node2 32935 1726853714.54809: Group ungrouped now contains managed_node3 32935 1726853714.54883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 32935 1726853714.54998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 32935 1726853714.55045: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 32935 1726853714.55073: Loaded config def from plugin (vars/host_group_vars) 32935 1726853714.55076: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 32935 1726853714.55082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 32935 1726853714.55090: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 32935 1726853714.55132: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 32935 1726853714.55430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853714.55517: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 32935 1726853714.55556: Loaded config def from plugin (connection/local) 32935 1726853714.55559: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 32935 1726853714.56556: Loaded config def from plugin (connection/paramiko_ssh) 32935 1726853714.56559: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 32935 1726853714.58684: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 32935 1726853714.58751: Loaded config def from plugin (connection/psrp) 32935 1726853714.58755: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 32935 1726853714.59608: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 32935 1726853714.59649: Loaded config def from plugin (connection/ssh) 32935 1726853714.59652: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 32935 1726853714.61667: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 32935 1726853714.61728: Loaded config def from plugin (connection/winrm) 32935 1726853714.61732: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 32935 1726853714.61764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 32935 1726853714.61830: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 32935 1726853714.61899: Loaded config def from plugin (shell/cmd) 32935 1726853714.61901: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 32935 1726853714.61928: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 32935 1726853714.61996: Loaded config def from plugin (shell/powershell) 32935 1726853714.61998: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 32935 1726853714.62051: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 32935 1726853714.62225: Loaded config def from plugin (shell/sh) 32935 1726853714.62227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 32935 1726853714.62269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 32935 1726853714.62387: Loaded config def from plugin (become/runas) 32935 1726853714.62389: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 32935 1726853714.62568: Loaded config def from plugin (become/su) 32935 1726853714.62570: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 32935 1726853714.62736: Loaded config def from plugin (become/sudo) 32935 1726853714.62739: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 32935 1726853714.62775: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml 32935 1726853714.63168: in VariableManager get_vars() 32935 1726853714.63189: done with get_vars() 32935 1726853714.63312: trying /usr/local/lib/python3.12/site-packages/ansible/modules 32935 1726853714.66642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 32935 1726853714.66968: in VariableManager get_vars() 
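For orientation: the entries above show the yaml inventory plugin parsing /tmp/network-iHm/inventory.yml and setting ansible_host and ansible_ssh_extra_args for managed_node1, managed_node2 and managed_node3, which then land in the ungrouped group. The inventory file itself is not reproduced in this log, so the sketch below is only a minimal guess at an inventory shape that would produce those entries; the addresses and SSH options are placeholders, not values from this run.

# Hypothetical inventory.yml matching the hosts seen above.
# Addresses and ssh options are placeholders, not taken from this log.
all:
  hosts:
    managed_node1:
      ansible_host: 198.51.100.11                      # placeholder address
      ansible_ssh_extra_args: '-o UserKnownHostsFile=/dev/null'
    managed_node2:
      ansible_host: 198.51.100.12
      ansible_ssh_extra_args: '-o UserKnownHostsFile=/dev/null'
    managed_node3:
      ansible_host: 198.51.100.13
      ansible_ssh_extra_args: '-o UserKnownHostsFile=/dev/null'

Hosts declared directly under all with no explicit group are what cause the "Group ungrouped now contains managed_node1/2/3" entries earlier in the log.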
32935 1726853714.66976: done with get_vars() 32935 1726853714.66979: variable 'playbook_dir' from source: magic vars 32935 1726853714.66980: variable 'ansible_playbook_python' from source: magic vars 32935 1726853714.66981: variable 'ansible_config_file' from source: magic vars 32935 1726853714.66982: variable 'groups' from source: magic vars 32935 1726853714.66982: variable 'omit' from source: magic vars 32935 1726853714.66983: variable 'ansible_version' from source: magic vars 32935 1726853714.66984: variable 'ansible_check_mode' from source: magic vars 32935 1726853714.66985: variable 'ansible_diff_mode' from source: magic vars 32935 1726853714.66985: variable 'ansible_forks' from source: magic vars 32935 1726853714.66986: variable 'ansible_inventory_sources' from source: magic vars 32935 1726853714.66987: variable 'ansible_skip_tags' from source: magic vars 32935 1726853714.66987: variable 'ansible_limit' from source: magic vars 32935 1726853714.66988: variable 'ansible_run_tags' from source: magic vars 32935 1726853714.66989: variable 'ansible_verbosity' from source: magic vars 32935 1726853714.67025: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml 32935 1726853714.68079: in VariableManager get_vars() 32935 1726853714.68098: done with get_vars() 32935 1726853714.68137: in VariableManager get_vars() 32935 1726853714.68151: done with get_vars() 32935 1726853714.68190: in VariableManager get_vars() 32935 1726853714.68213: done with get_vars() 32935 1726853714.68561: in VariableManager get_vars() 32935 1726853714.68576: done with get_vars() 32935 1726853714.68580: variable 'omit' from source: magic vars 32935 1726853714.68599: variable 'omit' from source: magic vars 32935 1726853714.68633: in VariableManager get_vars() 32935 1726853714.68645: done with get_vars() 32935 1726853714.68901: in VariableManager get_vars() 32935 1726853714.68914: done with get_vars() 32935 1726853714.68948: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 32935 1726853714.69362: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 32935 1726853714.69688: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 32935 1726853714.70928: in VariableManager get_vars() 32935 1726853714.70950: done with get_vars() 32935 1726853714.71762: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 32935 1726853714.72108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32935 1726853714.75521: in VariableManager get_vars() 32935 1726853714.75542: done with get_vars() 32935 1726853714.75582: in VariableManager get_vars() 32935 1726853714.75617: done with get_vars() 32935 1726853714.76081: in VariableManager get_vars() 32935 1726853714.76099: done with get_vars() 32935 1726853714.76104: variable 'omit' from source: magic vars 32935 1726853714.76116: variable 'omit' from source: magic vars 32935 1726853714.76149: in VariableManager get_vars() 32935 1726853714.76163: done with get_vars() 32935 1726853714.76231: in VariableManager get_vars() 32935 1726853714.76247: done with get_vars() 32935 1726853714.76483: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 32935 1726853714.76592: 
Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 32935 1726853714.76669: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 32935 1726853714.77833: in VariableManager get_vars() 32935 1726853714.77856: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32935 1726853714.81418: in VariableManager get_vars() 32935 1726853714.81442: done with get_vars() 32935 1726853714.81480: in VariableManager get_vars() 32935 1726853714.81499: done with get_vars() 32935 1726853714.81550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 32935 1726853714.81565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 32935 1726853714.83546: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 32935 1726853714.83817: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 32935 1726853714.83820: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 32935 1726853714.83854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 32935 1726853714.83881: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 32935 1726853714.84162: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 32935 1726853714.84224: Loaded config def from plugin (callback/default) 32935 1726853714.84227: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32935 1726853714.86032: Loaded config def from plugin (callback/junit) 32935 1726853714.86035: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32935 1726853714.86107: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 32935 1726853714.86269: Loaded config def from plugin (callback/minimal) 32935 1726853714.86273: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, 
class_only=True) 32935 1726853714.86311: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 32935 1726853714.86389: Loaded config def from plugin (callback/tree) 32935 1726853714.86392: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 32935 1726853714.86527: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 32935 1726853714.86529: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_vlan_mtu_nm.yml ************************************************ 2 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml 32935 1726853714.86555: in VariableManager get_vars() 32935 1726853714.86655: done with get_vars() 32935 1726853714.86664: in VariableManager get_vars() 32935 1726853714.86680: done with get_vars() 32935 1726853714.86685: variable 'omit' from source: magic vars 32935 1726853714.86727: in VariableManager get_vars() 32935 1726853714.86741: done with get_vars() 32935 1726853714.86766: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_vlan_mtu.yml' with nm as provider] ********* 32935 1726853714.87654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 32935 1726853714.87870: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 32935 1726853714.87916: getting the remaining hosts for this loop 32935 1726853714.87917: done getting the remaining hosts for this loop 32935 1726853714.87920: getting the next task for host managed_node1 32935 1726853714.87924: done getting next task for host managed_node1 32935 1726853714.87925: ^ task is: TASK: Gathering Facts 32935 1726853714.87927: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853714.87929: getting variables 32935 1726853714.87930: in VariableManager get_vars() 32935 1726853714.87939: Calling all_inventory to load vars for managed_node1 32935 1726853714.87941: Calling groups_inventory to load vars for managed_node1 32935 1726853714.87943: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853714.87953: Calling all_plugins_play to load vars for managed_node1 32935 1726853714.87962: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853714.87965: Calling groups_plugins_play to load vars for managed_node1 32935 1726853714.88001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853714.88084: done with get_vars() 32935 1726853714.88091: done getting variables 32935 1726853714.88147: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6 Friday 20 September 2024 13:35:14 -0400 (0:00:00.017) 0:00:00.017 ****** 32935 1726853714.88168: entering _queue_task() for managed_node1/gather_facts 32935 1726853714.88169: Creating lock for gather_facts 32935 1726853714.88525: worker is 1 (out of 1 available) 32935 1726853714.88534: exiting _queue_task() for managed_node1/gather_facts 32935 1726853714.88548: done queuing things up, now waiting for results queue to drain 32935 1726853714.88550: waiting for pending results... 
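The TASK [Gathering Facts] entry above (tests_vlan_mtu_nm.yml:6) is the implicit fact-gathering step Ansible queues at the start of a play. The real tests_vlan_mtu_nm.yml is not shown in this log (the banner only reports "2 plays" in it), so the snippet below is purely an illustrative reconstruction of a play header that would queue this task; hosts: all, the network_provider variable and the placeholder task are assumptions inferred from the play name, not contents confirmed by the log.

# Illustrative sketch only -- not the actual tests_vlan_mtu_nm.yml.
- name: Run playbook 'playbooks/tests_vlan_mtu.yml' with nm as provider
  hosts: all                  # assumed; the log does not show the play's host pattern
  gather_facts: true          # produces the implicit TASK [Gathering Facts] queued above
  vars:
    network_provider: nm      # assumed from the play name; not confirmed by the log
  tasks:
    - name: Placeholder for the real test tasks
      ansible.builtin.debug:
        msg: The actual test content lives in playbooks/tests_vlan_mtu.yml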
32935 1726853714.88989: running TaskExecutor() for managed_node1/TASK: Gathering Facts 32935 1726853714.89084: in run() - task 02083763-bbaf-84df-441d-0000000000af 32935 1726853714.89183: variable 'ansible_search_path' from source: unknown 32935 1726853714.89378: calling self._execute() 32935 1726853714.89401: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853714.89411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853714.89423: variable 'omit' from source: magic vars 32935 1726853714.89638: variable 'omit' from source: magic vars 32935 1726853714.89706: variable 'omit' from source: magic vars 32935 1726853714.89748: variable 'omit' from source: magic vars 32935 1726853714.89856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853714.89958: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853714.89998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853714.90059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853714.90246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853714.90249: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853714.90251: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853714.90253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853714.90405: Set connection var ansible_timeout to 10 32935 1726853714.90474: Set connection var ansible_shell_type to sh 32935 1726853714.90487: Set connection var ansible_pipelining to False 32935 1726853714.90493: Set connection var ansible_connection to ssh 32935 1726853714.90501: Set connection var ansible_shell_executable to /bin/sh 32935 1726853714.90510: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853714.90536: variable 'ansible_shell_executable' from source: unknown 32935 1726853714.90577: variable 'ansible_connection' from source: unknown 32935 1726853714.90585: variable 'ansible_module_compression' from source: unknown 32935 1726853714.90884: variable 'ansible_shell_type' from source: unknown 32935 1726853714.90887: variable 'ansible_shell_executable' from source: unknown 32935 1726853714.90890: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853714.90892: variable 'ansible_pipelining' from source: unknown 32935 1726853714.90894: variable 'ansible_timeout' from source: unknown 32935 1726853714.90896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853714.91072: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853714.91222: variable 'omit' from source: magic vars 32935 1726853714.91234: starting attempt loop 32935 1726853714.91242: running the handler 32935 1726853714.91264: variable 'ansible_facts' from source: unknown 32935 1726853714.91343: _low_level_execute_command(): starting 32935 1726853714.91357: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853714.93072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853714.93257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853714.93316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853714.93404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853714.93444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853714.93766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853714.95229: stdout chunk (state=3): >>>/root <<< 32935 1726853714.95386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853714.95437: stderr chunk (state=3): >>><<< 32935 1726853714.95446: stdout chunk (state=3): >>><<< 32935 1726853714.95480: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853714.95507: _low_level_execute_command(): starting 32935 1726853714.95574: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471 `" && echo ansible-tmp-1726853714.954953-32973-57050967083471="` echo /root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471 `" ) && sleep 0' 32935 1726853714.96990: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853714.96998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853714.97001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853714.97069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853714.97219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853714.97414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853714.97479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853714.99430: stdout chunk (state=3): >>>ansible-tmp-1726853714.954953-32973-57050967083471=/root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471 <<< 32935 1726853714.99485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853714.99876: stderr chunk (state=3): >>><<< 32935 1726853714.99880: stdout chunk (state=3): >>><<< 32935 1726853714.99882: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853714.954953-32973-57050967083471=/root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853714.99885: variable 'ansible_module_compression' from source: unknown 32935 1726853714.99886: ANSIBALLZ: Using generic lock for ansible.legacy.setup 32935 1726853714.99888: ANSIBALLZ: Acquiring lock 32935 1726853714.99891: ANSIBALLZ: Lock acquired: 140683294872048 32935 1726853714.99892: ANSIBALLZ: Creating module 
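The "Set connection var ansible_timeout to 10", "ansible_shell_type to sh", "ansible_pipelining to False" entries above are the ssh connection plugin resolving its defaults for managed_node1, and the "command -v python3.12 ..." probe that follows further down is Ansible's automatic interpreter discovery. Both can be made explicit per host with the same variable names the log reports. The sketch below mirrors the values this run resolved (timeout 10, /bin/sh, /usr/bin/python3.12), but the file placement (host_vars/managed_node1.yml) is an assumption for illustration; this run took them from built-in defaults rather than from such a file.

# host_vars/managed_node1.yml -- hypothetical placement; values mirror what
# the "Set connection var ..." entries above resolved for this run.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
# Pinning the interpreter skips the "command -v python3.12 ..." discovery
# probe that appears later in this log.
ansible_python_interpreter: /usr/bin/python3.12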
32935 1726853715.56503: ANSIBALLZ: Writing module into payload 32935 1726853715.56669: ANSIBALLZ: Writing module 32935 1726853715.56708: ANSIBALLZ: Renaming module 32935 1726853715.56720: ANSIBALLZ: Done creating module 32935 1726853715.56767: variable 'ansible_facts' from source: unknown 32935 1726853715.56784: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853715.56807: _low_level_execute_command(): starting 32935 1726853715.56818: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 32935 1726853715.57563: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853715.57591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853715.57607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853715.57626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853715.57695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853715.57747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853715.57777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853715.57803: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853715.57977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853715.59626: stdout chunk (state=3): >>>PLATFORM <<< 32935 1726853715.59683: stdout chunk (state=3): >>>Linux <<< 32935 1726853715.59720: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 32935 1726853715.59887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853715.59934: stderr chunk (state=3): >>><<< 32935 1726853715.59950: stdout chunk (state=3): >>><<< 32935 1726853715.60172: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853715.60179 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 32935 1726853715.60183: _low_level_execute_command(): starting 32935 1726853715.60185: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 32935 1726853715.60378: Sending initial data 32935 1726853715.60397: Sent initial data (1181 bytes) 32935 1726853715.61478: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853715.61483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853715.61495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853715.61595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853715.61666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853715.61680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853715.61786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853715.65167: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 32935 1726853715.65578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853715.65695: stderr chunk (state=3): >>><<< 32935 1726853715.65775: stdout chunk (state=3): >>><<< 32935 1726853715.65779: _low_level_execute_command() done: rc=0, 
stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853715.66179: variable 'ansible_facts' from source: unknown 32935 1726853715.66182: variable 'ansible_facts' from source: unknown 32935 1726853715.66185: variable 'ansible_module_compression' from source: unknown 32935 1726853715.66187: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32935 1726853715.66278: variable 'ansible_facts' from source: unknown 32935 1726853715.66574: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/AnsiballZ_setup.py 32935 1726853715.66727: Sending initial data 32935 1726853715.66738: Sent initial data (152 bytes) 32935 1726853715.67397: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 32935 1726853715.67478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 32935 1726853715.69423: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853715.69462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853715.69503: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp0jqakqvw /root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/AnsiballZ_setup.py <<< 32935 1726853715.69511: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/AnsiballZ_setup.py" <<< 32935 1726853715.69550: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp0jqakqvw" to remote "/root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/AnsiballZ_setup.py" <<< 32935 1726853715.69560: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/AnsiballZ_setup.py" <<< 32935 1726853715.71056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853715.71277: stdout chunk (state=3): >>><<< 32935 1726853715.71281: stderr chunk (state=3): >>><<< 32935 1726853715.71284: done transferring module to remote 32935 1726853715.71286: _low_level_execute_command(): starting 32935 1726853715.71289: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/ /root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/AnsiballZ_setup.py && sleep 0' 32935 1726853715.71770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853715.71777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853715.71789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853715.71875: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853715.72042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853715.72065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853715.73829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853715.73833: stderr chunk (state=3): >>><<< 32935 1726853715.73837: stdout chunk (state=3): >>><<< 32935 1726853715.73856: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853715.73859: _low_level_execute_command(): starting 32935 1726853715.73875: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/AnsiballZ_setup.py && sleep 0' 32935 1726853715.74484: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853715.74527: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853715.74595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853715.74613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853715.74643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853715.74751: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 32935 1726853715.76809: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 32935 1726853715.76836: stdout chunk (state=3): >>>import _imp # builtin <<< 32935 1726853715.76869: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 32935 1726853715.76940: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 32935 1726853715.76980: stdout chunk (state=3): >>>import 'posix' # <<< 32935 1726853715.77021: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 32935 1726853715.77047: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 32935 1726853715.77106: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853715.77140: stdout chunk (state=3): >>>import '_codecs' # <<< 32935 1726853715.77152: stdout chunk (state=3): >>>import 'codecs' # <<< 32935 1726853715.77199: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 32935 1726853715.77237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a63104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a62dfb30> <<< 32935 1726853715.77264: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 32935 1726853715.77305: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6312a50> import '_signal' # <<< 32935 1726853715.77332: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 32935 1726853715.77348: stdout chunk (state=3): >>>import 'io' # <<< 32935 1726853715.77375: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 32935 1726853715.77464: stdout chunk (state=3): >>>import '_collections_abc' # <<< 32935 1726853715.77491: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 32935 1726853715.77546: stdout chunk (state=3): >>>import 'os' # <<< 32935 1726853715.77567: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 32935 1726853715.77596: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 32935 1726853715.77621: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 32935 1726853715.77643: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60c1130> <<< 32935 1726853715.77710: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 32935 1726853715.77740: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60c1fa0> <<< 32935 1726853715.77781: stdout chunk (state=3): >>>import 'site' # <<< 32935 1726853715.77785: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 32935 1726853715.78252: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 32935 1726853715.78344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 32935 1726853715.78438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60ffe00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 32935 1726853715.78524: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60ffec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 32935 1726853715.78613: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 32935 1726853715.78616: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6137830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6137ec0> <<< 32935 1726853715.78707: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6117ad0> <<< 32935 1726853715.78777: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61151f0> <<< 32935 1726853715.78848: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60fcfb0> <<< 32935 1726853715.78915: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 32935 1726853715.78940: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 32935 1726853715.78986: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6157740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6156360> <<< 32935 1726853715.79009: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61160c0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6154bf0> <<< 32935 1726853715.79082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 32935 1726853715.79128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618c7d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60fc230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 32935 1726853715.79258: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a618cc80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618cb30> <<< 32935 1726853715.79262: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a618cf20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60fad50> <<< 32935 1726853715.79413: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618d610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff1a618d2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618e510> import 'importlib.util' # <<< 32935 1726853715.79490: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 32935 1726853715.79568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61a46e0> <<< 32935 1726853715.79599: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a61a5df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 32935 1726853715.79665: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61a6c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a61a72f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61a61e0> <<< 32935 1726853715.79683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 32935 1726853715.79739: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a61a7d70> <<< 32935 1726853715.79811: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61a74a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618e4b0> <<< 32935 1726853715.79855: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 32935 1726853715.79909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 32935 
1726853715.79954: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5e9bc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5ec47d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec4530> <<< 32935 1726853715.80062: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5ec4740> <<< 32935 1726853715.80125: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853715.80255: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5ec5100> <<< 32935 1726853715.80460: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5ec5ac0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec49b0> <<< 32935 1726853715.80521: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e99e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec6ea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec5be0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618ec00> <<< 32935 1726853715.80617: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 32935 1726853715.80684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5eef230> <<< 32935 1726853715.80800: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853715.80836: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f135c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 32935 1726853715.80915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 32935 1726853715.81012: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f743e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 32935 1726853715.81082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 32935 1726853715.81180: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f76b40> <<< 32935 1726853715.81236: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f74500> <<< 32935 1726853715.81330: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f353d0> <<< 32935 1726853715.81379: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5d7d460> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f123c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec7e00> <<< 32935 1726853715.81539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff1a5f129c0> <<< 32935 1726853715.81815: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_wepwvlc8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 32935 1726853715.81977: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object 
from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 32935 1726853715.82009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 32935 1726853715.82080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5de31d0> <<< 32935 1726853715.82119: stdout chunk (state=3): >>>import '_typing' # <<< 32935 1726853715.82328: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5dc20c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5dc1250> # zipimport: zlib available <<< 32935 1726853715.82352: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 32935 1726853715.83733: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.85275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 32935 1726853715.85279: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5de0f20> <<< 32935 1726853715.85303: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 32935 1726853715.85307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853715.85327: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 32935 1726853715.85348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 32935 1726853715.85367: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 32935 1726853715.85426: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5e12bd0> <<< 32935 1726853715.85476: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e12960> <<< 32935 1726853715.85526: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e122a0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 32935 1726853715.85531: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 32935 1726853715.85581: stdout chunk (state=3): >>>import 'json.encoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e12cf0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5de3e60> <<< 32935 1726853715.85619: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5e138f0> <<< 32935 1726853715.85643: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5e13b30> <<< 32935 1726853715.85668: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 32935 1726853715.85741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 32935 1726853715.85810: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e13ef0> <<< 32935 1726853715.85912: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 32935 1726853715.85916: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5725d30> <<< 32935 1726853715.85943: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5727950> <<< 32935 1726853715.85974: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 32935 1726853715.86009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 32935 1726853715.86021: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5728350> <<< 32935 1726853715.86057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 32935 1726853715.86094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5729220> <<< 32935 1726853715.86120: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 32935 1726853715.86176: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 32935 1726853715.86217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 32935 1726853715.86220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 32935 
1726853715.86306: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a572bfb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a57300e0> <<< 32935 1726853715.86321: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a572a270> <<< 32935 1726853715.86343: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 32935 1726853715.86414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 32935 1726853715.86438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 32935 1726853715.86612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 32935 1726853715.86628: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5733f20> import '_tokenize' # <<< 32935 1726853715.86743: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a57329f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5732750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 32935 1726853715.86864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5732cc0> <<< 32935 1726853715.86913: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a572a780> <<< 32935 1726853715.86916: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5777f20> <<< 32935 1726853715.86941: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5778140> <<< 32935 1726853715.86987: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 32935 1726853715.87012: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 32935 1726853715.87079: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5779d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5779ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 32935 1726853715.87111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 32935 1726853715.87189: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a577c260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a577a3c0> <<< 32935 1726853715.87248: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853715.87295: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 32935 1726853715.87298: stdout chunk (state=3): >>>import '_string' # <<< 32935 1726853715.87392: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a577fa40> <<< 32935 1726853715.87652: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a577c410> <<< 32935 1726853715.87680: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5780ad0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5780a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853715.87690: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5780d40> <<< 32935 1726853715.87700: stdout 
chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5778410> <<< 32935 1726853715.87732: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 32935 1726853715.87747: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 32935 1726853715.87774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 32935 1726853715.87824: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853715.87848: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5608380> <<< 32935 1726853715.87995: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a56094c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5782b10> <<< 32935 1726853715.88355: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5783ec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5782780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 32935 1726853715.88388: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.88423: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853715.88566: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.89180: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.89747: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 32935 1726853715.90052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' 
import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5611700> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56123f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56095e0> <<< 32935 1726853715.90055: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 32935 1726853715.90240: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 32935 1726853715.90428: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.90551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5612570> <<< 32935 1726853715.90576: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.91403: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853715.91520: stdout chunk (state=3): >>> <<< 32935 1726853715.92130: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.92277: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.92551: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853715.92639: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 32935 1726853715.92826: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # <<< 32935 1726853715.92910: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.92953: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available<<< 32935 1726853715.93008: stdout chunk (state=3): >>> # zipimport: zlib available <<< 32935 1726853715.93286: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 32935 1726853715.93347: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.93506: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853715.93632: stdout chunk (state=3): >>> <<< 32935 1726853715.93897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 32935 1726853715.94004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 32935 1726853715.94052: stdout chunk (state=3): >>>import '_ast' # <<< 32935 1726853715.94129: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5613650> <<< 32935 1726853715.94180: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853715.94208: stdout chunk (state=3): >>> <<< 32935 1726853715.94298: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.94448: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 32935 1726853715.94474: stdout chunk (state=3): >>> import 'ansible.module_utils.common.validation' # import 
'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 32935 1726853715.94566: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 32935 1726853715.94715: stdout chunk (state=3): >>> import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853715.94750: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.94833: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.94933: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 32935 1726853715.95119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a561e270> <<< 32935 1726853715.95170: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56198b0> <<< 32935 1726853715.95219: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 32935 1726853715.95237: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.95814: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 32935 1726853715.95817: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5706930> <<< 32935 1726853715.95832: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a57fe600> <<< 32935 1726853715.95875: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a561e000> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5616630> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 32935 1726853715.95949: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 32935 1726853715.95967: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 32935 1726853715.96046: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.96173: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853715.96306: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32935 1726853715.96435: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853715.96638: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 32935 1726853715.96691: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.97067: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 32935 1726853715.97074: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 32935 1726853715.97220: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b2150> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 32935 1726853715.97228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 32935 1726853715.97286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52c3fe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a52dc380> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a569b020> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b2c90> <<< 32935 1726853715.97334: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b0830> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b04a0> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 32935 1726853715.97417: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 32935 1726853715.97453: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 32935 1726853715.97694: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a52df350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52dec00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a52dede0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52de030> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 32935 1726853715.97698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52df440> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 32935 1726853715.97769: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a532df70> <<< 32935 1726853715.97806: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52dff50> <<< 32935 1726853715.97857: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b0530> import 'ansible.module_utils.facts.timeout' # <<< 32935 1726853715.97860: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853715.97881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 32935 1726853715.98062: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.98113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # <<< 32935 
1726853715.98129: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 32935 1726853715.98145: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.98168: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.98207: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 32935 1726853715.98330: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 32935 1726853715.98415: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # <<< 32935 1726853715.98418: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.98457: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.98520: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.98570: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.98641: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 32935 1726853715.98653: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.99127: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.99760: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 32935 1726853715.99809: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.99885: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.99925: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853715.99967: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 32935 1726853715.99987: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.00016: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.00044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 32935 1726853716.00066: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.00179: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.00222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 32935 1726853716.00255: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.00302: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 32935 1726853716.00373: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 32935 1726853716.00394: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.00583: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.00619: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 32935 1726853716.00657: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a532e180> <<< 32935 1726853716.00677: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches 
/usr/lib64/python3.12/configparser.py <<< 32935 1726853716.00703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 32935 1726853716.00890: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a532ede0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 32935 1726853716.01001: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.01082: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 32935 1726853716.01085: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.01230: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.01337: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 32935 1726853716.01396: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.01437: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.02080: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a536a420> <<< 32935 1726853716.02112: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a535b260> import 'ansible.module_utils.facts.system.python' # <<< 32935 1726853716.02131: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.02173: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.02287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 32935 1726853716.02598: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.02601: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32935 1726853716.02692: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 32935 1726853716.03155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 32935 1726853716.03592: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a537dcd0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a535b380> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853716.03849: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853716.04106: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853716.04143: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 32935 1726853716.04189: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.04350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 32935 1726853716.04370: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32935 1726853716.04395: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.04950: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.05447: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 32935 1726853716.05552: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.05670: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 32935 1726853716.05759: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.05857: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 32935 1726853716.05878: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.06013: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.06157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 32935 1726853716.06191: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 32935 1726853716.06238: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.06260: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.06341: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 32935 1726853716.06388: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.06551: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.06905: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.07025: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 32935 1726853716.07102: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 32935 1726853716.07133: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.07161: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 32935 1726853716.07484: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available <<< 32935 1726853716.07515: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 32935 1726853716.07518: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.07624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 32935 1726853716.07628: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.07681: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.08142: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 32935 1726853716.08239: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.08574: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 32935 1726853716.08578: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.08706: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.08742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 32935 1726853716.08764: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.08793: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.08928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 32935 1726853716.08942: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 32935 1726853716.08978: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09024: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 32935 1726853716.09045: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09249: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 32935 1726853716.09362: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09416: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 32935 1726853716.09445: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09533: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09546: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09601: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09755: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09879: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 32935 1726853716.09899: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.09987: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 32935 1726853716.10128: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.10338: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.linux' # <<< 32935 1726853716.10342: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.10370: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.10427: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 32935 1726853716.10467: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.10518: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 32935 1726853716.10623: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.10745: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 32935 1726853716.10748: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.10783: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.10880: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 32935 1726853716.10978: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853716.11541: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 32935 1726853716.11554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5116a80> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5115730> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a510e180> <<< 32935 1726853716.34961: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 32935 1726853716.34968: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a515ce90> <<< 32935 1726853716.35024: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a515cce0> <<< 32935 1726853716.35109: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 32935 1726853716.35117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853716.35295: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a515e2d0> <<< 32935 1726853716.35300: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a515dd90> <<< 32935 1726853716.35386: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 32935 1726853716.60985: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_loadavg": {"1m": 0.5537109375, "5m": 0.4267578125, "15m": 0.24609375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2932, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 599, "free": 2932}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 882, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261771128832, "block_size": 4096, "block_total": 65519099, "block_available": 63908967, "block_used": 1610132, "inode_total": 131070960, "inode_available": 131028923, "inode_used": 42037, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", 
"scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off 
[fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "16", "epoch": "1726853716", "epoch_int": "1726853716", "date": "2024-09-20", "time": "13:35:16", "iso8601_micro": "2024-09-20T17:35:16.602234Z", "iso8601": "2024-09-20T17:35:16Z", "iso8601_basic": "20240920T133516602234", "iso8601_basic_short": "20240920T133516", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32935 1726853716.61311: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks <<< 32935 1726853716.61368: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re <<< 32935 1726853716.61651: stdout chunk (state=3): >>># cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # 
cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing 
ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # 
cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 32935 1726853716.62053: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 32935 1726853716.62090: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 32935 1726853716.62376: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 32935 1726853716.62451: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 32935 1726853716.62476: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 32935 1726853716.62552: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 32935 1726853716.62815: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 32935 1726853716.62818: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 32935 1726853716.62944: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 32935 1726853716.62985: stdout chunk (state=3): >>># destroy _collections <<< 32935 1726853716.63005: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 32935 1726853716.63051: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 32935 1726853716.63082: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 32935 1726853716.63316: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # 
destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 32935 1726853716.63877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853716.63890: stdout chunk (state=3): >>><<< 32935 1726853716.63915: stderr chunk (state=3): >>><<< 32935 1726853716.64087: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a63104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a62dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6312a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60c1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
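
The echoed stdout above starts with the remote interpreter installing the zipimport hook before any Ansible code runs; the same hook is what later loads the ansible.legacy.setup payload archive (the "# zipimport: found 103 names in ... setup_payload.zip" line further down). A minimal, self-contained sketch of that mechanism, with invented file and module names (payload_demo.zip, hello_mod): a zip archive placed on sys.path becomes an import source.

    import sys
    import zipfile

    # Pack a tiny module into a zip archive (names are invented for the demo).
    with zipfile.ZipFile("payload_demo.zip", "w") as zf:
        zf.writestr("hello_mod.py", "GREETING = 'hello from inside the zip'\n")

    # Putting the archive itself on sys.path lets the zipimport hook resolve
    # imports from it, the same mechanism the trace above uses for the payload.
    sys.path.insert(0, "payload_demo.zip")

    import hello_mod
    print(hello_mod.GREETING)
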
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60ffe00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60ffec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6137830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6137ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6117ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61151f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60fcfb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6157740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6156360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61160c0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a6154bf0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618c7d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60fc230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a618cc80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618cb30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a618cf20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a60fad50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618d610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618d2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618e510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61a46e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a61a5df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff1a61a6c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a61a72f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61a61e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a61a7d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a61a74a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618e4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5e9bc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5ec47d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec4530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5ec4740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5ec5100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5ec5ac0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec49b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e99e20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec6ea0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec5be0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a618ec00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5eef230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f135c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f743e0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f76b40> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f74500> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f353d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5d7d460> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5f123c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5ec7e00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7ff1a5f129c0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_wepwvlc8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5de31d0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5dc20c0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5dc1250> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5de0f20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5e12bd0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e12960> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e122a0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e12cf0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5de3e60> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5e138f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5e13b30> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5e13ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5725d30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5727950> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5728350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5729220> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a572bfb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a57300e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a572a270> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5733f20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a57329f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff1a5732750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5732cc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a572a780> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5777f20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5778140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5779d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5779ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a577c260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a577a3c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a577fa40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a577c410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5780ad0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5780a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5780d40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5778410> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5608380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a56094c0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5782b10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5783ec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5782780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5611700> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56123f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56095e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5612570> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5613650> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a561e270> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56198b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5706930> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a57fe600> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a561e000> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5616630> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b2150> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52c3fe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a52dc380> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a569b020> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b2c90> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b0830> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b04a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a52df350> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52dec00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a52dede0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52de030> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52df440> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a532df70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a52dff50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a56b0530> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a532e180> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a532ede0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a536a420> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a535b260> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a537dcd0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a535b380> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff1a5116a80> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a5115730> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a510e180> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a515ce90> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a515cce0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a515e2d0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff1a515dd90> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_loadavg": {"1m": 0.5537109375, "5m": 0.4267578125, "15m": 0.24609375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2932, "ansible_swaptotal_mb": 0, 
"ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 599, "free": 2932}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 882, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261771128832, "block_size": 4096, "block_total": 65519099, "block_available": 63908967, "block_used": 1610132, "inode_total": 131070960, "inode_available": 131028923, "inode_used": 42037, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "16", "epoch": "1726853716", "epoch_int": "1726853716", "date": "2024-09-20", "time": "13:35:16", "iso8601_micro": "2024-09-20T17:35:16.602234Z", "iso8601": "2024-09-20T17:35:16Z", "iso8601_basic": "20240920T133516602234", "iso8601_basic_short": "20240920T133516", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] 
removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing 
ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing 
multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping 
re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # 
cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy 
_ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing 
ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy 
ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy 
_sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # 
cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
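The interpreter-discovery warning above can be avoided by pinning the interpreter for the managed host instead of relying on discovery. A minimal inventory sketch, assuming the managed_node1 host and the /usr/bin/python3.12 path reported in this run (the YAML layout and group placement are illustrative, not copied from the actual test inventory):

    all:
      hosts:
        managed_node1:
          ansible_host: 10.31.45.153
          # Assumed pin: point ansible-core at the interpreter it already discovered,
          # so discovery is skipped and the [WARNING] above is not emitted.
          ansible_python_interpreter: /usr/bin/python3.12

With ansible_python_interpreter set explicitly, ansible-core uses that path as-is and the interpreter-discovery heuristics (and their warning) do not run for this host.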
32935 1726853716.66479: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853716.66482: _low_level_execute_command(): starting 32935 1726853716.66485: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853714.954953-32973-57050967083471/ > /dev/null 2>&1 && sleep 0' 32935 1726853716.66846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853716.66991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853716.67035: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853716.67138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853716.69831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853716.69837: stdout chunk (state=3): >>><<< 32935 1726853716.69839: stderr chunk (state=3): >>><<< 32935 1726853716.70078: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853716.70082: handler run complete 32935 1726853716.70085: variable 'ansible_facts' from source: unknown 32935 1726853716.70197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853716.71728: variable 'ansible_facts' from source: unknown 32935 1726853716.71982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853716.72477: attempt loop complete, returning result 32935 1726853716.72481: _execute() done 32935 1726853716.72484: dumping result to json 32935 1726853716.72486: done dumping result, returning 32935 1726853716.72489: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-84df-441d-0000000000af] 32935 1726853716.72491: sending task result for task 02083763-bbaf-84df-441d-0000000000af 32935 1726853716.73177: done sending task result for task 02083763-bbaf-84df-441d-0000000000af 32935 1726853716.73185: WORKER PROCESS EXITING ok: [managed_node1] 32935 1726853716.73661: no more pending results, returning what we have 32935 1726853716.73664: results queue empty 32935 1726853716.73664: checking for any_errors_fatal 32935 1726853716.73665: done checking for any_errors_fatal 32935 1726853716.73666: checking for max_fail_percentage 32935 1726853716.73668: done checking for max_fail_percentage 32935 1726853716.73668: checking to see if all hosts have failed and the running result is not ok 32935 1726853716.73669: done checking to see if all hosts have failed 32935 1726853716.73670: getting the remaining hosts for this loop 32935 1726853716.73673: done getting the remaining hosts for this loop 32935 1726853716.73676: getting the next task for host managed_node1 32935 1726853716.73683: done getting next task for host managed_node1 32935 1726853716.73684: ^ task is: TASK: meta (flush_handlers) 32935 1726853716.73686: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853716.73689: getting variables 32935 1726853716.73690: in VariableManager get_vars() 32935 1726853716.73712: Calling all_inventory to load vars for managed_node1 32935 1726853716.73714: Calling groups_inventory to load vars for managed_node1 32935 1726853716.73717: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853716.73739: Calling all_plugins_play to load vars for managed_node1 32935 1726853716.73742: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853716.73745: Calling groups_plugins_play to load vars for managed_node1 32935 1726853716.73917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853716.74255: done with get_vars() 32935 1726853716.74266: done getting variables 32935 1726853716.74338: in VariableManager get_vars() 32935 1726853716.74347: Calling all_inventory to load vars for managed_node1 32935 1726853716.74349: Calling groups_inventory to load vars for managed_node1 32935 1726853716.74351: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853716.74355: Calling all_plugins_play to load vars for managed_node1 32935 1726853716.74357: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853716.74360: Calling groups_plugins_play to load vars for managed_node1 32935 1726853716.74511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853716.74703: done with get_vars() 32935 1726853716.74726: done queuing things up, now waiting for results queue to drain 32935 1726853716.74728: results queue empty 32935 1726853716.74729: checking for any_errors_fatal 32935 1726853716.74731: done checking for any_errors_fatal 32935 1726853716.74731: checking for max_fail_percentage 32935 1726853716.74732: done checking for max_fail_percentage 32935 1726853716.74733: checking to see if all hosts have failed and the running result is not ok 32935 1726853716.74734: done checking to see if all hosts have failed 32935 1726853716.74738: getting the remaining hosts for this loop 32935 1726853716.74739: done getting the remaining hosts for this loop 32935 1726853716.74742: getting the next task for host managed_node1 32935 1726853716.74746: done getting next task for host managed_node1 32935 1726853716.74748: ^ task is: TASK: Include the task 'el_repo_setup.yml' 32935 1726853716.74749: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853716.74751: getting variables 32935 1726853716.74752: in VariableManager get_vars() 32935 1726853716.74759: Calling all_inventory to load vars for managed_node1 32935 1726853716.74761: Calling groups_inventory to load vars for managed_node1 32935 1726853716.74764: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853716.74768: Calling all_plugins_play to load vars for managed_node1 32935 1726853716.74770: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853716.74775: Calling groups_plugins_play to load vars for managed_node1 32935 1726853716.74901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853716.75044: done with get_vars() 32935 1726853716.75050: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:11 Friday 20 September 2024 13:35:16 -0400 (0:00:01.869) 0:00:01.886 ****** 32935 1726853716.75110: entering _queue_task() for managed_node1/include_tasks 32935 1726853716.75112: Creating lock for include_tasks 32935 1726853716.75359: worker is 1 (out of 1 available) 32935 1726853716.75373: exiting _queue_task() for managed_node1/include_tasks 32935 1726853716.75384: done queuing things up, now waiting for results queue to drain 32935 1726853716.75386: waiting for pending results... 32935 1726853716.75708: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 32935 1726853716.75713: in run() - task 02083763-bbaf-84df-441d-000000000006 32935 1726853716.75716: variable 'ansible_search_path' from source: unknown 32935 1726853716.75744: calling self._execute() 32935 1726853716.75818: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853716.75838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853716.75853: variable 'omit' from source: magic vars 32935 1726853716.75973: _execute() done 32935 1726853716.75984: dumping result to json 32935 1726853716.75991: done dumping result, returning 32935 1726853716.76001: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-84df-441d-000000000006] 32935 1726853716.76009: sending task result for task 02083763-bbaf-84df-441d-000000000006 32935 1726853716.76206: no more pending results, returning what we have 32935 1726853716.76212: in VariableManager get_vars() 32935 1726853716.76245: Calling all_inventory to load vars for managed_node1 32935 1726853716.76248: Calling groups_inventory to load vars for managed_node1 32935 1726853716.76252: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853716.76291: Calling all_plugins_play to load vars for managed_node1 32935 1726853716.76294: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853716.76297: Calling groups_plugins_play to load vars for managed_node1 32935 1726853716.76452: done sending task result for task 02083763-bbaf-84df-441d-000000000006 32935 1726853716.76480: WORKER PROCESS EXITING 32935 1726853716.76492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853716.76617: done with get_vars() 32935 1726853716.76622: variable 'ansible_search_path' from source: unknown 32935 1726853716.76632: we have included files to process 32935 1726853716.76632: 
generating all_blocks data 32935 1726853716.76633: done generating all_blocks data 32935 1726853716.76634: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 32935 1726853716.76635: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 32935 1726853716.76637: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 32935 1726853716.77250: in VariableManager get_vars() 32935 1726853716.77264: done with get_vars() 32935 1726853716.77278: done processing included file 32935 1726853716.77279: iterating over new_blocks loaded from include file 32935 1726853716.77281: in VariableManager get_vars() 32935 1726853716.77289: done with get_vars() 32935 1726853716.77290: filtering new block on tags 32935 1726853716.77304: done filtering new block on tags 32935 1726853716.77307: in VariableManager get_vars() 32935 1726853716.77317: done with get_vars() 32935 1726853716.77318: filtering new block on tags 32935 1726853716.77332: done filtering new block on tags 32935 1726853716.77334: in VariableManager get_vars() 32935 1726853716.77343: done with get_vars() 32935 1726853716.77344: filtering new block on tags 32935 1726853716.77355: done filtering new block on tags 32935 1726853716.77357: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 32935 1726853716.77362: extending task lists for all hosts with included blocks 32935 1726853716.77408: done extending task lists 32935 1726853716.77409: done processing included files 32935 1726853716.77410: results queue empty 32935 1726853716.77410: checking for any_errors_fatal 32935 1726853716.77412: done checking for any_errors_fatal 32935 1726853716.77412: checking for max_fail_percentage 32935 1726853716.77413: done checking for max_fail_percentage 32935 1726853716.77414: checking to see if all hosts have failed and the running result is not ok 32935 1726853716.77415: done checking to see if all hosts have failed 32935 1726853716.77415: getting the remaining hosts for this loop 32935 1726853716.77416: done getting the remaining hosts for this loop 32935 1726853716.77419: getting the next task for host managed_node1 32935 1726853716.77422: done getting next task for host managed_node1 32935 1726853716.77424: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 32935 1726853716.77427: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853716.77429: getting variables 32935 1726853716.77430: in VariableManager get_vars() 32935 1726853716.77438: Calling all_inventory to load vars for managed_node1 32935 1726853716.77440: Calling groups_inventory to load vars for managed_node1 32935 1726853716.77442: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853716.77447: Calling all_plugins_play to load vars for managed_node1 32935 1726853716.77449: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853716.77452: Calling groups_plugins_play to load vars for managed_node1 32935 1726853716.77599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853716.77766: done with get_vars() 32935 1726853716.77776: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:35:16 -0400 (0:00:00.027) 0:00:01.913 ****** 32935 1726853716.77835: entering _queue_task() for managed_node1/setup 32935 1726853716.78038: worker is 1 (out of 1 available) 32935 1726853716.78050: exiting _queue_task() for managed_node1/setup 32935 1726853716.78060: done queuing things up, now waiting for results queue to drain 32935 1726853716.78062: waiting for pending results... 32935 1726853716.78215: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 32935 1726853716.78287: in run() - task 02083763-bbaf-84df-441d-0000000000c0 32935 1726853716.78295: variable 'ansible_search_path' from source: unknown 32935 1726853716.78298: variable 'ansible_search_path' from source: unknown 32935 1726853716.78321: calling self._execute() 32935 1726853716.78374: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853716.78378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853716.78393: variable 'omit' from source: magic vars 32935 1726853716.78877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853716.80798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853716.80898: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853716.80968: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853716.80980: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853716.80987: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853716.81050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853716.81073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853716.81096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 32935 1726853716.81122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853716.81133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853716.81476: variable 'ansible_facts' from source: unknown 32935 1726853716.81480: variable 'network_test_required_facts' from source: task vars 32935 1726853716.81483: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 32935 1726853716.81485: variable 'omit' from source: magic vars 32935 1726853716.81488: variable 'omit' from source: magic vars 32935 1726853716.81507: variable 'omit' from source: magic vars 32935 1726853716.81639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853716.81642: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853716.81644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853716.81646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853716.81648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853716.81651: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853716.81666: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853716.81692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853716.81751: Set connection var ansible_timeout to 10 32935 1726853716.81755: Set connection var ansible_shell_type to sh 32935 1726853716.81763: Set connection var ansible_pipelining to False 32935 1726853716.81766: Set connection var ansible_connection to ssh 32935 1726853716.81770: Set connection var ansible_shell_executable to /bin/sh 32935 1726853716.81777: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853716.81876: variable 'ansible_shell_executable' from source: unknown 32935 1726853716.81879: variable 'ansible_connection' from source: unknown 32935 1726853716.81881: variable 'ansible_module_compression' from source: unknown 32935 1726853716.81884: variable 'ansible_shell_type' from source: unknown 32935 1726853716.81886: variable 'ansible_shell_executable' from source: unknown 32935 1726853716.81888: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853716.81891: variable 'ansible_pipelining' from source: unknown 32935 1726853716.81893: variable 'ansible_timeout' from source: unknown 32935 1726853716.81895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853716.82012: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853716.82030: variable 'omit' from source: magic vars 32935 1726853716.82034: starting attempt loop 32935 
1726853716.82036: running the handler 32935 1726853716.82078: _low_level_execute_command(): starting 32935 1726853716.82082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853716.82788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853716.82803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853716.82820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853716.82839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853716.82949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853716.83029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853716.83110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853716.85607: stdout chunk (state=3): >>>/root <<< 32935 1726853716.85651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853716.85691: stderr chunk (state=3): >>><<< 32935 1726853716.85695: stdout chunk (state=3): >>><<< 32935 1726853716.85712: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853716.85726: _low_level_execute_command(): starting 32935 1726853716.85731: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519 `" && echo 
ansible-tmp-1726853716.857133-33065-198919573009519="` echo /root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519 `" ) && sleep 0' 32935 1726853716.86335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853716.86339: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853716.86341: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853716.86344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853716.86510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853716.86532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853716.86602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853716.89156: stdout chunk (state=3): >>>ansible-tmp-1726853716.857133-33065-198919573009519=/root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519 <<< 32935 1726853716.89299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853716.89326: stderr chunk (state=3): >>><<< 32935 1726853716.89329: stdout chunk (state=3): >>><<< 32935 1726853716.89346: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853716.857133-33065-198919573009519=/root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853716.89401: variable 'ansible_module_compression' from source: unknown 32935 1726853716.89483: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32935 1726853716.89550: variable 'ansible_facts' from source: unknown 32935 1726853716.89923: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/AnsiballZ_setup.py 32935 1726853716.90053: Sending initial data 32935 1726853716.90068: Sent initial data (153 bytes) 32935 1726853716.91030: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853716.91076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853716.91174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853716.91265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853716.92765: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32935 1726853716.92774: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853716.92814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853716.92855: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpdubn_a25 /root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/AnsiballZ_setup.py <<< 32935 1726853716.92859: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/AnsiballZ_setup.py" <<< 32935 1726853716.92940: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpdubn_a25" to remote "/root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/AnsiballZ_setup.py" <<< 32935 1726853716.95056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853716.95060: stderr chunk (state=3): >>><<< 32935 1726853716.95069: stdout chunk (state=3): >>><<< 32935 1726853716.95101: done transferring module to remote 32935 1726853716.95114: _low_level_execute_command(): starting 32935 1726853716.95117: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/ /root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/AnsiballZ_setup.py && sleep 0' 32935 1726853716.96475: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853716.96492: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853716.96581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853716.96596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853716.96613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853716.96638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853716.96654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853716.96747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 32935 1726853716.99080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853716.99101: stdout chunk (state=3): >>><<< 32935 1726853716.99222: stderr chunk (state=3): >>><<< 32935 1726853716.99226: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 32935 1726853716.99229: _low_level_execute_command(): starting 32935 1726853716.99232: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/AnsiballZ_setup.py && sleep 0' 32935 1726853717.00302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853717.00306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853717.00308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.00310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.00312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.00496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853717.00569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853717.00586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853717.00798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853717.02977: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 32935 1726853717.03288: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 32935 1726853717.03710: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # 
/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fbe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fc1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 32935 1726853717.03728: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.03755: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa2dfa0> import 'site' # <<< 32935 1726853717.03786: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 32935 1726853717.04160: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 32935 1726853717.04191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 32935 1726853717.04215: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 32935 1726853717.04274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 32935 1726853717.04287: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 32935 1726853717.04312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa6bec0> <<< 32935 1726853717.04341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 32935 1726853717.04491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa6bf80> <<< 32935 1726853717.04597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faa3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faa3ec0> import '_collections' # <<< 32935 1726853717.04626: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa83b60> <<< 32935 1726853717.04747: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa812b0> <<< 32935 1726853717.04807: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa69070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 32935 1726853717.04853: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 32935 1726853717.04895: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 32935 1726853717.05009: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fac37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fac23f0> <<< 32935 1726853717.05039: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fac0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faf8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 32935 1726853717.05076: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7faf8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faf8bf0> <<< 32935 1726853717.05140: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.05164: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7faf8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa66e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.05284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 32935 1726853717.05287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faf9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faf9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fafa540> <<< 32935 1726853717.05307: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 32935 
1726853717.05386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fb10740> <<< 32935 1726853717.05442: stdout chunk (state=3): >>>import 'errno' # <<< 32935 1726853717.05493: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7fb11e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 32935 1726853717.05565: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fb12cc0> <<< 32935 1726853717.05666: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7fb132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fb12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 32935 1726853717.05700: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7fb13d70> <<< 32935 1726853717.05719: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fb134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fafa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 32935 1726853717.05742: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 32935 1726853717.05795: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.05901: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f80fbf0> # 
/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f838740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8384a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f838770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 32935 1726853717.05954: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.06080: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f8390a0> <<< 32935 1726853717.06213: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f839a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f838950> <<< 32935 1726853717.06262: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f80ddc0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 32935 1726853717.06332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 32935 1726853717.06379: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f83ade0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f839b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fafa6f0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 32935 1726853717.06472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.06481: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 32935 1726853717.06664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fbe7f867140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 32935 1726853717.06668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.06670: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 32935 1726853717.06674: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 32935 1726853717.06703: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f887530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 32935 1726853717.06817: stdout chunk (state=3): >>>import 'ntpath' # <<< 32935 1726853717.06824: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8e82c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 32935 1726853717.06934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 32935 1726853717.06937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 32935 1726853717.06999: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8eaa20> <<< 32935 1726853717.07126: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8e83e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8ad2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f6f1340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f886330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f83bd10> <<< 32935 1726853717.07538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fbe7f886930> <<< 32935 1726853717.07573: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_ixitxj0x/ansible_setup_payload.zip' <<< 32935 1726853717.07596: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.07770: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 32935 
1726853717.07842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 32935 1726853717.08095: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f75b050> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f739f40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f7390d0> # zipimport: zlib available <<< 32935 1726853717.08099: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 32935 1726853717.08123: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.08129: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.08147: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 32935 1726853717.08150: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.09584: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.10650: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f7596d0> <<< 32935 1726853717.10679: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.10709: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 32935 1726853717.10726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 32935 1726853717.10776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.10885: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f78a990> <<< 32935 1726853717.10899: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f78a720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f78a030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f78a510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f838440> <<< 32935 1726853717.10912: stdout chunk (state=3): >>>import 'atexit' # <<< 32935 1726853717.10994: stdout chunk (state=3): >>># extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f78b650> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f78b890> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 32935 1726853717.11086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f78bd10> import 'pwd' # <<< 32935 1726853717.11110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 32935 1726853717.11132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 32935 1726853717.11164: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f12dac0> <<< 32935 1726853717.11222: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f12f6e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 32935 1726853717.11239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 32935 1726853717.11273: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f12ffb0> <<< 32935 1726853717.11289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 32935 1726853717.11335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 32935 1726853717.11348: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f131220> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 32935 1726853717.11390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 32935 1726853717.11429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 32935 1726853717.11433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 32935 1726853717.11600: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f133cb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f73b0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f131fa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 32935 1726853717.11609: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 32935 1726853717.11710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 32935 1726853717.11754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 32935 1726853717.11765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f13bbc0> import '_tokenize' # <<< 32935 1726853717.11839: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f13a6c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f13a420> <<< 32935 1726853717.11861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 32935 1726853717.11922: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f13a960> <<< 32935 1726853717.11946: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f1324b0> <<< 32935 1726853717.11982: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.12021: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f17fe60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f17fe90> <<< 32935 1726853717.12026: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 32935 1726853717.12063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 32935 1726853717.12066: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 32935 1726853717.12124: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension 
module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f1819d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f181790> <<< 32935 1726853717.12128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 32935 1726853717.12155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 32935 1726853717.12212: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.12215: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f183f20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f1820c0> <<< 32935 1726853717.12237: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 32935 1726853717.12267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.12296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 32935 1726853717.12346: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f187590> <<< 32935 1726853717.12467: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f183f50> <<< 32935 1726853717.12526: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.12562: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f1886e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f188770> <<< 32935 1726853717.12624: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f1889b0> <<< 32935 1726853717.12627: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f180050> <<< 32935 1726853717.12652: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 32935 1726853717.12693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 32935 1726853717.12714: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.12733: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f18bf50> <<< 32935 1726853717.12872: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.12883: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f014fb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f18a750> <<< 32935 1726853717.12927: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f18baa0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f18a360> <<< 32935 1726853717.12966: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 32935 1726853717.12969: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.13053: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.13154: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 32935 1726853717.13199: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 32935 1726853717.13211: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.13316: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.13465: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.13995: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.14541: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 32935 1726853717.14544: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 32935 1726853717.14561: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.14607: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # 
extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f019100> <<< 32935 1726853717.14691: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 32935 1726853717.14715: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f019f10> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f017c50> <<< 32935 1726853717.14781: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 32935 1726853717.14809: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 32935 1726853717.14820: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.14972: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.15121: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f019ca0> <<< 32935 1726853717.15136: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.15584: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16252: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16306: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16410: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 32935 1726853717.16416: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16462: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16508: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 32935 1726853717.16513: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16699: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16732: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 32935 1726853717.16750: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16758: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16780: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 32935 1726853717.16835: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.16877: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 32935 1726853717.16890: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.17250: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.17629: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 32935 1726853717.17694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 32935 1726853717.17787: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f01b020> <<< 32935 1726853717.17909: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 32935 1726853717.17915: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.18010: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 32935 1726853717.18024: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 32935 1726853717.18030: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 32935 1726853717.18103: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.18108: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.18155: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 32935 1726853717.18213: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.18229: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.18288: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.18398: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.18450: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 32935 1726853717.18489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.18583: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.18592: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f025ca0> <<< 32935 1726853717.18639: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f020a40> <<< 32935 1726853717.18835: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853717.18919: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.18976: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 32935 1726853717.18982: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 32935 1726853717.19061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 32935 1726853717.19069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 32935 1726853717.19128: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fbe7f10e5a0> <<< 32935 1726853717.19191: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f7b6270> <<< 32935 1726853717.19287: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f025a60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f018c50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 32935 1726853717.19312: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19342: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 32935 1726853717.19416: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 32935 1726853717.19422: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19451: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 32935 1726853717.19497: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19566: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19569: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19610: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19645: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19679: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19713: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19752: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 32935 1726853717.19773: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19829: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19923: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.19949: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # <<< 32935 1726853717.20000: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.20128: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.20498: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.20508: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 32935 1726853717.20536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 32935 1726853717.20549: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 32935 1726853717.20570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 32935 1726853717.20595: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b5940> <<< 32935 1726853717.20641: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 32935 1726853717.20645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 32935 1726853717.20663: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 32935 1726853717.20747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 32935 1726853717.20792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec43c20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.20854: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ec43f80> <<< 32935 1726853717.20891: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f09ebd0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b64e0> <<< 32935 1726853717.20924: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b5310> <<< 32935 1726853717.20966: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b7aa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 32935 1726853717.21015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 32935 1726853717.21088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 32935 1726853717.21108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 32935 1726853717.21111: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ec56f90> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec56870> <<< 32935 1726853717.21153: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ec56a50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec55cd0> <<< 32935 
1726853717.21230: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 32935 1726853717.21333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 32935 1726853717.21372: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec57140> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 32935 1726853717.21433: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ecadc40> <<< 32935 1726853717.21450: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec57c20> <<< 32935 1726853717.21501: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b50d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 32935 1726853717.21533: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32935 1726853717.21535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 32935 1726853717.21603: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.21627: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.21691: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 32935 1726853717.21738: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.21777: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.21848: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 32935 1726853717.21853: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.21898: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 32935 1726853717.21964: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 32935 1726853717.21972: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.22037: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.22091: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 32935 1726853717.22120: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.22177: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.22227: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 32935 1726853717.22312: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.22407: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.22465: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.22540: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 32935 1726853717.22650: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.23318: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.23988: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853717.23993: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 32935 1726853717.24066: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.24069: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 32935 1726853717.24074: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.24109: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.24178: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 32935 1726853717.24207: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.24250: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 32935 1726853717.24254: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.24288: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.24312: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 32935 1726853717.24392: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.24487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 32935 1726853717.24491: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 32935 1726853717.24855: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ecafc80> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ecae6f0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available <<< 32935 1726853717.24934: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 32935 1726853717.24948: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.25094: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.25202: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 32935 1726853717.25281: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.25303: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.25402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 32935 1726853717.25413: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.25501: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.25562: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 32935 
1726853717.25583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 32935 1726853717.25697: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.25758: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ecf1e50> <<< 32935 1726853717.26078: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ecd1b80> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available<<< 32935 1726853717.26168: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.26177: stdout chunk (state=3): >>> <<< 32935 1726853717.26264: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 32935 1726853717.26270: stdout chunk (state=3): >>> <<< 32935 1726853717.26309: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.26314: stdout chunk (state=3): >>> <<< 32935 1726853717.26447: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.26453: stdout chunk (state=3): >>> <<< 32935 1726853717.26604: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.26791: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.27035: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available<<< 32935 1726853717.27114: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.27119: stdout chunk (state=3): >>> <<< 32935 1726853717.27210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 32935 1726853717.27213: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.27276: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.27352: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 32935 1726853717.27375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 32935 1726853717.27425: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.27430: stdout chunk (state=3): >>> <<< 32935 1726853717.27462: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.27486: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ecf9910> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ecd2f00><<< 32935 1726853717.27524: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available<<< 32935 1726853717.27530: stdout chunk (state=3): >>> <<< 32935 1726853717.27563: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 32935 1726853717.27595: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.27652: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.27666: 
stdout chunk (state=3): >>> <<< 32935 1726853717.27735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 32935 1726853717.27754: stdout chunk (state=3): >>> <<< 32935 1726853717.27783: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.28021: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.28101: stdout chunk (state=3): >>> <<< 32935 1726853717.28304: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 32935 1726853717.28328: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.28341: stdout chunk (state=3): >>> <<< 32935 1726853717.28493: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.28665: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.28739: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.28804: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 32935 1726853717.28826: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.28859: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.28889: stdout chunk (state=3): >>> <<< 32935 1726853717.28914: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.29211: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.29357: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 32935 1726853717.29384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 32935 1726853717.29427: stdout chunk (state=3): >>> <<< 32935 1726853717.29432: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.29612: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.29618: stdout chunk (state=3): >>> <<< 32935 1726853717.29807: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 32935 1726853717.29847: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.29909: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.29925: stdout chunk (state=3): >>> <<< 32935 1726853717.29978: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.30887: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.31712: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 32935 1726853717.31739: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.31901: stdout chunk (state=3): >>> <<< 32935 1726853717.31907: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.32081: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 32935 1726853717.32112: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.32276: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.32419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 32935 1726853717.32425: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.32911: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # <<< 32935 1726853717.32942: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 32935 
1726853717.32949: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 32935 1726853717.32964: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.33077: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 32935 1726853717.33081: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.33244: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.33385: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.33712: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.34032: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 32935 1726853717.34039: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 32935 1726853717.34053: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.34151: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # <<< 32935 1726853717.34161: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.34195: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.34225: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 32935 1726853717.34239: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.34341: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.34461: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 32935 1726853717.34502: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.34535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 32935 1726853717.34687: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # <<< 32935 1726853717.34695: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.34780: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.34865: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 32935 1726853717.34876: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.35281: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.35704: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 32935 1726853717.35707: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.35795: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.35893: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 32935 1726853717.35897: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.35995: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 32935 1726853717.36030: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36076: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 32935 1726853717.36089: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36131: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36188: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: 
zlib available <<< 32935 1726853717.36393: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 32935 1726853717.36428: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36452: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 32935 1726853717.36481: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36539: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 32935 1726853717.36627: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36663: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36681: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36736: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36803: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.36918: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.37016: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 32935 1726853717.37048: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available<<< 32935 1726853717.37056: stdout chunk (state=3): >>> <<< 32935 1726853717.37113: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.37194: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 32935 1726853717.37695: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.37811: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 32935 1726853717.37817: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.37884: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.37946: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 32935 1726853717.37966: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.38024: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.38093: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 32935 1726853717.38096: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.38218: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.38338: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 32935 1726853717.38348: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 32935 1726853717.38360: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.38489: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.38627: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 32935 1726853717.38742: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.39096: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 32935 1726853717.39123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 32935 1726853717.39141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 32935 1726853717.39180: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7eaf7230> <<< 32935 1726853717.39209: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7eaf5970> <<< 32935 1726853717.39295: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7eaf7b30> <<< 32935 1726853717.40792: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO<<< 32935 1726853717.40818: stdout chunk (state=3): >>>12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "17", "epoch": "1726853717", "epoch_int": "1726853717", "date": "2024-09-20", "time": "13:35:17", "iso8601_micro": "2024-09-20T17:35:17.396149Z", "iso8601": "2024-09-20T17:35:17Z", "iso8601_basic": "20240920T133517396149", "iso8601_basic_short": "20240920T133517", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32935 1726853717.41883: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal 
# cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] 
removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] 
removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env <<< 32935 1726853717.41898: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy 
ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin <<< 32935 1726853717.41967: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd <<< 32935 1726853717.41976: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl <<< 32935 1726853717.41980: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos<<< 32935 1726853717.41982: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 32935 1726853717.42434: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 32935 1726853717.42438: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 32935 1726853717.42478: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 32935 1726853717.42721: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy 
shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 32935 1726853717.42738: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 32935 1726853717.42750: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array <<< 32935 1726853717.42784: stdout chunk (state=3): >>># destroy _compat_pickle <<< 32935 1726853717.42788: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 32935 1726853717.43003: stdout chunk (state=3): >>># destroy multiprocessing.process<<< 32935 1726853717.43054: stdout chunk (state=3): >>> # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 32935 1726853717.43118: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc <<< 32935 1726853717.43151: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # 
cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 32935 1726853717.43182: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 32935 1726853717.43193: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 32935 1726853717.43197: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 32935 1726853717.43599: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 32935 1726853717.43672: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 32935 1726853717.43681: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections <<< 32935 1726853717.43686: stdout chunk (state=3): >>># destroy threading <<< 32935 1726853717.43698: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 32935 1726853717.43705: stdout chunk (state=3): >>># destroy time <<< 32935 1726853717.43732: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 32935 1726853717.43769: stdout chunk (state=3): >>># destroy _hashlib <<< 32935 1726853717.43780: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 32935 1726853717.43820: stdout chunk (state=3): >>># destroy itertools <<< 32935 1726853717.43823: stdout chunk (state=3): >>># destroy _abc <<< 32935 1726853717.43825: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 32935 1726853717.43847: stdout chunk (state=3): >>># clear sys.audit hooks <<< 32935 1726853717.44394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853717.44428: stderr chunk (state=3): >>><<< 32935 1726853717.44431: stdout chunk (state=3): >>><<< 32935 1726853717.44536: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fbe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fc1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa2dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa6bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa6bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faa3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faa3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa83b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa69070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fac37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fac23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa82150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fac0bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faf8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7faf8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faf8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7faf8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fa66e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faf9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7faf9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fafa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fb10740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7fb11e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fbe7fb12cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7fb132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fb12210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7fb13d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fb134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fafa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f80fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f838740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8384a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f838770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f8390a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f839a00> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f838950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f80ddc0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f83ade0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f839b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7fafa6f0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f867140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f887530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8e82c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8eaa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8e83e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f8ad2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f6f1340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f886330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f83bd10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fbe7f886930> # zipimport: found 103 names in '/tmp/ansible_setup_payload_ixitxj0x/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f75b050> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f739f40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f7390d0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f7596d0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f78a990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f78a720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f78a030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f78a510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f838440> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f78b650> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f78b890> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f78bd10> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f12dac0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f12f6e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f12ffb0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f131220> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f133cb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f73b0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f131fa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f13bbc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f13a6c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f13a420> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f13a960> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f1324b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f17fe60> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f17fe90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f1819d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f181790> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f183f20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f1820c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f187590> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f183f50> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f1886e0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f188770> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f1889b0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f180050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f18bf50> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f014fb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f18a750> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f18baa0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f18a360> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f019100> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f019f10> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f017c50> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f019ca0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f01b020> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7f025ca0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f020a40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f10e5a0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f7b6270> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f025a60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f018c50> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b5940> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec43c20> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ec43f80> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f09ebd0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b64e0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b5310> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b7aa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ec56f90> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec56870> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ec56a50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec55cd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec57140> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ecadc40> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ec57c20> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7f0b50d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ecafc80> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ecae6f0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ecf1e50> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ecd1b80> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7ecf9910> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7ecd2f00> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbe7eaf7230> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7eaf5970> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbe7eaf7b30> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", 
"LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "17", "epoch": "1726853717", "epoch_int": "1726853717", "date": "2024-09-20", "time": "13:35:17", "iso8601_micro": "2024-09-20T17:35:17.396149Z", "iso8601": "2024-09-20T17:35:17Z", "iso8601_basic": "20240920T133517396149", "iso8601_basic_short": "20240920T133517", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] 
wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
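Editor's note on the payload captured above: the setup module's stdout is a single JSON document (the "ansible_facts" object) immediately followed by Python interpreter shutdown noise, because PYTHONVERBOSE=1 is present in the remote environment (visible in ansible_env above). The controller still recovers the facts and only emits the [WARNING] shown next. The sketch below is illustrative only, not Ansible's actual parsing code: it shows how a mixed stdout of this shape could be split into the leading JSON object and the trailing junk using the standard-library json.JSONDecoder.raw_decode; the helper name split_json_and_junk and the sample string are mine.

    import json

    def split_json_and_junk(stdout: str):
        # Minimal sketch (not Ansible's parser): separate a leading JSON
        # document from whatever verbose-interpreter output trails it,
        # like the module stdout captured above.
        decoder = json.JSONDecoder()
        start = stdout.index("{")  # assumes the JSON document begins at the first brace
        obj, end = decoder.raw_decode(stdout, start)
        return obj, stdout[end:].strip()

    # Shape resembling the captured output: JSON facts, then cleanup chatter.
    sample = '{"ansible_facts": {"ansible_system": "Linux"}} # clear sys.path_importer_cache ...'
    facts, junk = split_json_and_junk(sample)
    print(facts["ansible_facts"]["ansible_system"])  # -> Linux
    print(junk)                                      # -> the non-JSON remainder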
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 32935 1726853717.45361: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853717.45364: _low_level_execute_command(): starting 32935 1726853717.45367: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853716.857133-33065-198919573009519/ > /dev/null 2>&1 && sleep 0' 32935 1726853717.45369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853717.45373: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853717.45375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.45407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853717.45410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.45413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853717.45415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.45430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.45476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853717.45489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853717.45544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853717.48146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853717.48176: stderr chunk (state=3): >>><<< 32935 1726853717.48179: stdout chunk (state=3): >>><<< 32935 1726853717.48195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853717.48204: handler run complete 32935 1726853717.48232: variable 'ansible_facts' from source: unknown 32935 1726853717.48274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853717.48345: variable 'ansible_facts' from source: unknown 32935 1726853717.48378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853717.48411: attempt loop complete, returning result 32935 1726853717.48416: _execute() done 32935 1726853717.48418: dumping result to json 32935 1726853717.48428: done dumping result, returning 32935 1726853717.48435: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-84df-441d-0000000000c0] 32935 1726853717.48438: sending task result for task 02083763-bbaf-84df-441d-0000000000c0 32935 1726853717.48568: done sending task result for task 02083763-bbaf-84df-441d-0000000000c0 32935 1726853717.48570: WORKER PROCESS EXITING ok: [managed_node1] 32935 1726853717.48668: no more pending results, returning what we have 32935 
1726853717.48672: results queue empty 32935 1726853717.48673: checking for any_errors_fatal 32935 1726853717.48674: done checking for any_errors_fatal 32935 1726853717.48674: checking for max_fail_percentage 32935 1726853717.48676: done checking for max_fail_percentage 32935 1726853717.48677: checking to see if all hosts have failed and the running result is not ok 32935 1726853717.48678: done checking to see if all hosts have failed 32935 1726853717.48678: getting the remaining hosts for this loop 32935 1726853717.48687: done getting the remaining hosts for this loop 32935 1726853717.48691: getting the next task for host managed_node1 32935 1726853717.48699: done getting next task for host managed_node1 32935 1726853717.48701: ^ task is: TASK: Check if system is ostree 32935 1726853717.48704: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853717.48707: getting variables 32935 1726853717.48708: in VariableManager get_vars() 32935 1726853717.48732: Calling all_inventory to load vars for managed_node1 32935 1726853717.48734: Calling groups_inventory to load vars for managed_node1 32935 1726853717.48737: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853717.48746: Calling all_plugins_play to load vars for managed_node1 32935 1726853717.48748: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853717.48750: Calling groups_plugins_play to load vars for managed_node1 32935 1726853717.48891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853717.49009: done with get_vars() 32935 1726853717.49018: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:35:17 -0400 (0:00:00.712) 0:00:02.626 ****** 32935 1726853717.49084: entering _queue_task() for managed_node1/stat 32935 1726853717.49291: worker is 1 (out of 1 available) 32935 1726853717.49302: exiting _queue_task() for managed_node1/stat 32935 1726853717.49313: done queuing things up, now waiting for results queue to drain 32935 1726853717.49315: waiting for pending results... 
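Editor's note before the stat task output below: the play is about to run TASK [Check if system is ostree] with the stat action over the same multiplexed SSH connection. As a hedged illustration (kept in Python for consistency with the sketch above), the check amounts to testing for an ostree marker file on the managed node. The exact path the task stats is not shown in this excerpt; /run/ostree-booted is the conventional marker on rpm-ostree based systems and is assumed here, and is_ostree_host is a hypothetical helper, not code from the role.

    import os

    # Assumed marker path; the actual path checked by the task is not visible
    # in this log excerpt.
    OSTREE_MARKER = "/run/ostree-booted"

    def is_ostree_host(marker: str = OSTREE_MARKER) -> bool:
        """Return True if the marker file for an ostree-booted system exists."""
        return os.path.exists(marker)

    if __name__ == "__main__":
        # On the CentOS Stream 10 node above this prints False, since the
        # facts show a conventional dnf/systemd install rather than rpm-ostree.
        print(f"ostree booted: {is_ostree_host()}")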
32935 1726853717.49455: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 32935 1726853717.49522: in run() - task 02083763-bbaf-84df-441d-0000000000c2 32935 1726853717.49531: variable 'ansible_search_path' from source: unknown 32935 1726853717.49537: variable 'ansible_search_path' from source: unknown 32935 1726853717.49564: calling self._execute() 32935 1726853717.49622: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853717.49625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853717.49634: variable 'omit' from source: magic vars 32935 1726853717.49965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853717.50143: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853717.50178: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853717.50204: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853717.50247: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853717.50314: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853717.50332: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853717.50354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853717.50375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853717.50466: Evaluated conditional (not __network_is_ostree is defined): True 32935 1726853717.50469: variable 'omit' from source: magic vars 32935 1726853717.50494: variable 'omit' from source: magic vars 32935 1726853717.50519: variable 'omit' from source: magic vars 32935 1726853717.50538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853717.50561: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853717.50579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853717.50591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853717.50599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853717.50620: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853717.50624: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853717.50627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853717.50702: Set connection var ansible_timeout to 10 32935 1726853717.50706: Set connection var ansible_shell_type to sh 32935 1726853717.50714: Set connection var 
ansible_pipelining to False 32935 1726853717.50716: Set connection var ansible_connection to ssh 32935 1726853717.50721: Set connection var ansible_shell_executable to /bin/sh 32935 1726853717.50726: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853717.50744: variable 'ansible_shell_executable' from source: unknown 32935 1726853717.50747: variable 'ansible_connection' from source: unknown 32935 1726853717.50749: variable 'ansible_module_compression' from source: unknown 32935 1726853717.50752: variable 'ansible_shell_type' from source: unknown 32935 1726853717.50754: variable 'ansible_shell_executable' from source: unknown 32935 1726853717.50756: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853717.50762: variable 'ansible_pipelining' from source: unknown 32935 1726853717.50764: variable 'ansible_timeout' from source: unknown 32935 1726853717.50769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853717.50866: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853717.50874: variable 'omit' from source: magic vars 32935 1726853717.50882: starting attempt loop 32935 1726853717.50884: running the handler 32935 1726853717.50896: _low_level_execute_command(): starting 32935 1726853717.50903: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853717.51406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.51410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.51413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.51414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.51469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853717.51474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853717.51478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853717.51523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853717.53793: stdout chunk (state=3): >>>/root <<< 32935 1726853717.53948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853717.53978: stderr chunk (state=3): >>><<< 32935 1726853717.53982: stdout chunk (state=3): >>><<< 32935 1726853717.54008: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853717.54019: _low_level_execute_command(): starting 32935 1726853717.54022: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793 `" && echo ansible-tmp-1726853717.5400386-33100-185259019179793="` echo /root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793 `" ) && sleep 0' 32935 1726853717.54439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853717.54443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853717.54467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.54470: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.54483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.54533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853717.54536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853717.54588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853717.57276: stdout chunk (state=3): >>>ansible-tmp-1726853717.5400386-33100-185259019179793=/root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793 <<< 32935 1726853717.57427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853717.57459: stderr chunk (state=3): >>><<< 32935 1726853717.57466: stdout chunk (state=3): >>><<< 32935 
1726853717.57483: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853717.5400386-33100-185259019179793=/root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853717.57524: variable 'ansible_module_compression' from source: unknown 32935 1726853717.57573: ANSIBALLZ: Using lock for stat 32935 1726853717.57576: ANSIBALLZ: Acquiring lock 32935 1726853717.57579: ANSIBALLZ: Lock acquired: 140683295495984 32935 1726853717.57581: ANSIBALLZ: Creating module 32935 1726853717.65085: ANSIBALLZ: Writing module into payload 32935 1726853717.65145: ANSIBALLZ: Writing module 32935 1726853717.65165: ANSIBALLZ: Renaming module 32935 1726853717.65179: ANSIBALLZ: Done creating module 32935 1726853717.65193: variable 'ansible_facts' from source: unknown 32935 1726853717.65236: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/AnsiballZ_stat.py 32935 1726853717.65341: Sending initial data 32935 1726853717.65345: Sent initial data (153 bytes) 32935 1726853717.65805: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.65810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853717.65813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.65816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.65818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.65876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853717.65879: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853717.65881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853717.65943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853717.68232: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32935 1726853717.68240: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853717.68274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853717.68321: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpdanpkt3c /root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/AnsiballZ_stat.py <<< 32935 1726853717.68324: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/AnsiballZ_stat.py" <<< 32935 1726853717.68360: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpdanpkt3c" to remote "/root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/AnsiballZ_stat.py" <<< 32935 1726853717.68364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/AnsiballZ_stat.py" <<< 32935 1726853717.68934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853717.68984: stderr chunk (state=3): >>><<< 32935 1726853717.68988: stdout chunk (state=3): >>><<< 32935 1726853717.69011: done transferring module to remote 32935 1726853717.69022: _low_level_execute_command(): starting 32935 1726853717.69027: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/ /root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/AnsiballZ_stat.py && sleep 0' 32935 1726853717.69481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853717.69484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853717.69487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.69489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 32935 1726853717.69491: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.69493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.69545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853717.69549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853717.69553: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853717.69595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853717.72154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853717.72183: stderr chunk (state=3): >>><<< 32935 1726853717.72188: stdout chunk (state=3): >>><<< 32935 1726853717.72207: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853717.72210: _low_level_execute_command(): starting 32935 1726853717.72214: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/AnsiballZ_stat.py && sleep 0' 32935 1726853717.72654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.72660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.72662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853717.72665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853717.72717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853717.72724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853717.72726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853717.72775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853717.76008: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 32935 1726853717.76074: stdout chunk (state=3): >>>import _imp # builtin <<< 32935 1726853717.76120: stdout chunk (state=3): >>>import '_thread' # <<< 32935 1726853717.76146: stdout chunk (state=3): >>>import '_warnings' # <<< 32935 1726853717.76154: stdout chunk (state=3): >>> <<< 32935 1726853717.76172: stdout chunk (state=3): >>>import '_weakref' # <<< 32935 1726853717.76277: stdout chunk (state=3): >>>import '_io' # <<< 32935 1726853717.76304: stdout chunk (state=3): >>>import 'marshal' # <<< 32935 1726853717.76364: stdout chunk (state=3): >>> import 'posix' # <<< 32935 1726853717.76369: stdout chunk (state=3): >>> <<< 32935 1726853717.76418: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 32935 1726853717.76427: stdout chunk (state=3): >>> <<< 32935 1726853717.76446: stdout chunk (state=3): >>># installing zipimport hook<<< 32935 1726853717.76449: stdout chunk (state=3): >>> <<< 32935 1726853717.76490: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 32935 1726853717.76601: stdout chunk (state=3): >>> # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 32935 1726853717.76639: stdout chunk (state=3): >>>import 'codecs' # <<< 32935 1726853717.76644: stdout chunk (state=3): >>> <<< 32935 1726853717.76705: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 32935 1726853717.76708: stdout chunk (state=3): >>> <<< 32935 1726853717.76747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 32935 1726853717.76752: stdout chunk (state=3): >>> <<< 32935 1726853717.76779: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461f684d0><<< 32935 1726853717.76789: stdout chunk (state=3): >>> <<< 32935 1726853717.76800: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461f37b30><<< 32935 1726853717.76836: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 32935 1726853717.76856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 32935 1726853717.76878: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461f6aa50><<< 32935 1726853717.76911: stdout chunk (state=3): >>> import '_signal' # <<< 32935 1726853717.76953: stdout chunk (state=3): >>>import '_abc' # <<< 32935 
1726853717.76966: stdout chunk (state=3): >>> import 'abc' # <<< 32935 1726853717.76999: stdout chunk (state=3): >>> import 'io' # <<< 32935 1726853717.77053: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 32935 1726853717.77176: stdout chunk (state=3): >>> import '_collections_abc' # <<< 32935 1726853717.77181: stdout chunk (state=3): >>> <<< 32935 1726853717.77220: stdout chunk (state=3): >>>import 'genericpath' # <<< 32935 1726853717.77240: stdout chunk (state=3): >>> <<< 32935 1726853717.77243: stdout chunk (state=3): >>>import 'posixpath' # <<< 32935 1726853717.77245: stdout chunk (state=3): >>> <<< 32935 1726853717.77289: stdout chunk (state=3): >>>import 'os' # <<< 32935 1726853717.77296: stdout chunk (state=3): >>> <<< 32935 1726853717.77320: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 32935 1726853717.77325: stdout chunk (state=3): >>> <<< 32935 1726853717.77349: stdout chunk (state=3): >>>Processing user site-packages<<< 32935 1726853717.77368: stdout chunk (state=3): >>> <<< 32935 1726853717.77374: stdout chunk (state=3): >>>Processing global site-packages<<< 32935 1726853717.77385: stdout chunk (state=3): >>> <<< 32935 1726853717.77397: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages'<<< 32935 1726853717.77412: stdout chunk (state=3): >>> <<< 32935 1726853717.77421: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 32935 1726853717.77467: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py<<< 32935 1726853717.77474: stdout chunk (state=3): >>> <<< 32935 1726853717.77492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 32935 1726853717.77532: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d5d130><<< 32935 1726853717.77534: stdout chunk (state=3): >>> <<< 32935 1726853717.77637: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853717.77662: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d5dfa0><<< 32935 1726853717.77711: stdout chunk (state=3): >>> import 'site' # <<< 32935 1726853717.77714: stdout chunk (state=3): >>> <<< 32935 1726853717.77769: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux<<< 32935 1726853717.77774: stdout chunk (state=3): >>> <<< 32935 1726853717.77776: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information.<<< 32935 1726853717.77995: stdout chunk (state=3): >>> <<< 32935 1726853717.78181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 32935 1726853717.78183: stdout chunk (state=3): >>> <<< 32935 1726853717.78211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 32935 1726853717.78216: stdout chunk (state=3): >>> <<< 32935 1726853717.78252: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 32935 1726853717.78259: stdout chunk (state=3): >>> <<< 32935 1726853717.78289: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853717.78291: stdout chunk (state=3): >>> <<< 32935 1726853717.78324: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 32935 1726853717.78330: stdout chunk (state=3): >>> <<< 32935 1726853717.78396: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 32935 1726853717.78408: stdout chunk (state=3): >>> <<< 32935 1726853717.78441: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 32935 1726853717.78446: stdout chunk (state=3): >>> <<< 32935 1726853717.78488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 32935 1726853717.78494: stdout chunk (state=3): >>> <<< 32935 1726853717.78526: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d9bec0><<< 32935 1726853717.78532: stdout chunk (state=3): >>> <<< 32935 1726853717.78567: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 32935 1726853717.78576: stdout chunk (state=3): >>> <<< 32935 1726853717.78611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 32935 1726853717.78614: stdout chunk (state=3): >>> <<< 32935 1726853717.78659: stdout chunk (state=3): >>>import '_operator' # <<< 32935 1726853717.78661: stdout chunk (state=3): >>> <<< 32935 1726853717.78676: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d9bf80><<< 32935 1726853717.78714: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py<<< 32935 1726853717.78718: stdout chunk (state=3): >>> <<< 32935 1726853717.78766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc'<<< 32935 1726853717.78812: stdout chunk (state=3): >>> # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py<<< 32935 1726853717.78818: stdout chunk (state=3): >>> <<< 32935 1726853717.78896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853717.78935: stdout chunk (state=3): >>> import 'itertools' # <<< 32935 1726853717.78942: stdout chunk (state=3): >>> <<< 32935 1726853717.78977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 32935 1726853717.79197: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461dd3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461dd3ec0> import 
'_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461db3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461db12b0> <<< 32935 1726853717.79287: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d99070> <<< 32935 1726853717.79335: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 32935 1726853717.79372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 32935 1726853717.79400: stdout chunk (state=3): >>>import '_sre' # <<< 32935 1726853717.79440: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 32935 1726853717.79483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 32935 1726853717.79491: stdout chunk (state=3): >>> <<< 32935 1726853717.79518: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 32935 1726853717.79544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 32935 1726853717.79601: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461df37d0> <<< 32935 1726853717.79627: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461df23f0><<< 32935 1726853717.79630: stdout chunk (state=3): >>> <<< 32935 1726853717.79679: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 32935 1726853717.79682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc'<<< 32935 1726853717.79703: stdout chunk (state=3): >>> <<< 32935 1726853717.79706: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461db2150> <<< 32935 1726853717.79728: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461df0bc0> <<< 32935 1726853717.79800: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 32935 1726853717.79807: stdout chunk (state=3): >>> <<< 32935 1726853717.79829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc'<<< 32935 1726853717.79849: stdout chunk (state=3): >>> import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e28890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d982f0><<< 32935 1726853717.79888: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 32935 1726853717.79891: stdout chunk (state=3): >>> <<< 32935 1726853717.79901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 32935 1726853717.79929: stdout chunk (state=3): >>> <<< 32935 1726853717.79959: stdout chunk (state=3): >>># extension module '_struct' loaded from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.79980: stdout chunk (state=3): >>> # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.79997: stdout chunk (state=3): >>> import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e28d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e28bf0><<< 32935 1726853717.80046: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.80072: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.80101: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e28fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d96e10><<< 32935 1726853717.80153: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 32935 1726853717.80162: stdout chunk (state=3): >>> <<< 32935 1726853717.80173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853717.80185: stdout chunk (state=3): >>> <<< 32935 1726853717.80212: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 32935 1726853717.80261: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 32935 1726853717.80291: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e29670><<< 32935 1726853717.80294: stdout chunk (state=3): >>> <<< 32935 1726853717.80319: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e29370> import 'importlib.machinery' # <<< 32935 1726853717.80373: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py<<< 32935 1726853717.80380: stdout chunk (state=3): >>> <<< 32935 1726853717.80383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc'<<< 32935 1726853717.80404: stdout chunk (state=3): >>> <<< 32935 1726853717.80445: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e2a540> import 'importlib.util' # <<< 32935 1726853717.80450: stdout chunk (state=3): >>> <<< 32935 1726853717.80502: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 32935 1726853717.80507: stdout chunk (state=3): >>> <<< 32935 1726853717.80562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 32935 1726853717.80565: stdout chunk (state=3): >>> <<< 32935 1726853717.80608: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fe461e40740><<< 32935 1726853717.80638: stdout chunk (state=3): >>> import 'errno' # <<< 32935 1726853717.80679: stdout chunk (state=3): >>> # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.80683: stdout chunk (state=3): >>> <<< 32935 1726853717.80709: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.80714: stdout chunk (state=3): >>> <<< 32935 1726853717.80754: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e41e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 32935 1726853717.80759: stdout chunk (state=3): >>> <<< 32935 1726853717.80781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 32935 1726853717.80822: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 32935 1726853717.80852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 32935 1726853717.80873: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e42cc0><<< 32935 1726853717.80922: stdout chunk (state=3): >>> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.80926: stdout chunk (state=3): >>> <<< 32935 1726853717.80951: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.80980: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e432f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e42210> <<< 32935 1726853717.81029: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 32935 1726853717.81080: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.81085: stdout chunk (state=3): >>> <<< 32935 1726853717.81111: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.81114: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e43d70><<< 32935 1726853717.81146: stdout chunk (state=3): >>> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e434a0> <<< 32935 1726853717.81239: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e2a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 32935 1726853717.81303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 32935 1726853717.81344: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 32935 1726853717.81401: stdout chunk (state=3): >>> # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.81405: stdout chunk (state=3): >>> # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.81412: stdout chunk (state=3): >>> <<< 32935 1726853717.81414: stdout chunk (state=3): >>>import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461bbfbf0><<< 32935 1726853717.81455: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 32935 1726853717.81468: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 32935 1726853717.81509: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.81521: stdout chunk (state=3): >>> # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.81531: stdout chunk (state=3): >>> <<< 32935 1726853717.81538: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461be8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461be84a0><<< 32935 1726853717.81578: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.81599: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.81601: stdout chunk (state=3): >>> import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461be8770><<< 32935 1726853717.81643: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 32935 1726853717.81666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'<<< 32935 1726853717.81675: stdout chunk (state=3): >>> <<< 32935 1726853717.81773: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.81853: stdout chunk (state=3): >>> <<< 32935 1726853717.81981: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.81988: stdout chunk (state=3): >>> <<< 32935 1726853717.82007: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461be90a0> <<< 32935 1726853717.82198: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.82203: stdout chunk (state=3): >>> <<< 32935 1726853717.82227: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.82232: stdout chunk (state=3): >>> <<< 32935 1726853717.82254: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fe461be9a00><<< 32935 1726853717.82258: stdout chunk (state=3): >>> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461be8950><<< 32935 1726853717.82295: stdout chunk (state=3): >>> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461bbddc0> <<< 32935 1726853717.82367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 32935 1726853717.82398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py<<< 32935 1726853717.82425: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc'<<< 32935 1726853717.82428: stdout chunk (state=3): >>> <<< 32935 1726853717.82482: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461beade0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461be9b20><<< 32935 1726853717.82486: stdout chunk (state=3): >>> <<< 32935 1726853717.82554: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e2a6f0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 32935 1726853717.82559: stdout chunk (state=3): >>> <<< 32935 1726853717.82661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853717.82664: stdout chunk (state=3): >>> <<< 32935 1726853717.82752: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 32935 1726853717.82755: stdout chunk (state=3): >>> <<< 32935 1726853717.82807: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c17140><<< 32935 1726853717.82812: stdout chunk (state=3): >>> <<< 32935 1726853717.82899: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py<<< 32935 1726853717.82902: stdout chunk (state=3): >>> <<< 32935 1726853717.82955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 32935 1726853717.82996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 32935 1726853717.83063: stdout chunk (state=3): >>> import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c37530><<< 32935 1726853717.83072: stdout chunk (state=3): >>> <<< 32935 1726853717.83106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py<<< 32935 1726853717.83115: stdout chunk (state=3): >>> <<< 32935 1726853717.83179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 32935 1726853717.83184: stdout chunk (state=3): >>> <<< 32935 1726853717.83262: stdout chunk (state=3): >>>import 'ntpath' # <<< 32935 1726853717.83304: stdout 
chunk (state=3): >>> # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 32935 1726853717.83311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853717.83323: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c982c0><<< 32935 1726853717.83356: stdout chunk (state=3): >>> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 32935 1726853717.83408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 32935 1726853717.83452: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py<<< 32935 1726853717.83456: stdout chunk (state=3): >>> <<< 32935 1726853717.83516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 32935 1726853717.83521: stdout chunk (state=3): >>> <<< 32935 1726853717.83693: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c9aa20> <<< 32935 1726853717.83789: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c983e0><<< 32935 1726853717.83793: stdout chunk (state=3): >>> <<< 32935 1726853717.83857: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c5d2b0> <<< 32935 1726853717.83911: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461529340> <<< 32935 1726853717.84101: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c36330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461bebd10> <<< 32935 1726853717.84117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 32935 1726853717.84162: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe461c36930> <<< 32935 1726853717.84347: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_prw9nu5c/ansible_stat_payload.zip'<<< 32935 1726853717.84351: stdout chunk (state=3): >>> <<< 32935 1726853717.84379: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.84609: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.84613: stdout chunk (state=3): >>> <<< 32935 1726853717.84662: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 32935 1726853717.84665: stdout chunk (state=3): >>> <<< 32935 1726853717.84697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 32935 1726853717.84764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 32935 1726853717.84769: stdout chunk (state=3): >>> <<< 32935 1726853717.84880: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 32935 1726853717.84923: stdout chunk (state=3): >>> # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 32935 1726853717.84930: stdout chunk (state=3): >>> <<< 32935 1726853717.84949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 32935 1726853717.84952: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46157f050><<< 32935 1726853717.84978: stdout chunk (state=3): >>> import '_typing' # <<< 32935 1726853717.85092: stdout chunk (state=3): >>> <<< 32935 1726853717.85249: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46155df40> <<< 32935 1726853717.85274: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46155d0d0> <<< 32935 1726853717.85300: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.85340: stdout chunk (state=3): >>>import 'ansible' # <<< 32935 1726853717.85369: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.85405: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.85442: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.85446: stdout chunk (state=3): >>> <<< 32935 1726853717.85474: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 32935 1726853717.85510: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.85597: stdout chunk (state=3): >>> <<< 32935 1726853717.87733: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.89580: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 32935 1726853717.89601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 32935 1726853717.89625: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46157d6d0> <<< 32935 1726853717.89679: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853717.89720: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py<<< 32935 1726853717.89726: stdout chunk (state=3): >>> <<< 32935 1726853717.89745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc'<<< 32935 1726853717.89784: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 32935 1726853717.89805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 32935 1726853717.89854: stdout chunk (state=3): >>> # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.89864: stdout chunk (state=3): >>> <<< 32935 1726853717.89868: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 32935 
1726853717.89887: stdout chunk (state=3): >>> <<< 32935 1726853717.89951: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4615a6990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615a6720><<< 32935 1726853717.89957: stdout chunk (state=3): >>> <<< 32935 1726853717.90015: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615a6030><<< 32935 1726853717.90020: stdout chunk (state=3): >>> <<< 32935 1726853717.90057: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py<<< 32935 1726853717.90062: stdout chunk (state=3): >>> <<< 32935 1726853717.90082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 32935 1726853717.90144: stdout chunk (state=3): >>> import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615a6a80><<< 32935 1726853717.90151: stdout chunk (state=3): >>> <<< 32935 1726853717.90173: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461be8440> <<< 32935 1726853717.90200: stdout chunk (state=3): >>>import 'atexit' # <<< 32935 1726853717.90246: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.90261: stdout chunk (state=3): >>> # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4615a76e0><<< 32935 1726853717.90296: stdout chunk (state=3): >>> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.90301: stdout chunk (state=3): >>> <<< 32935 1726853717.90345: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4615a7920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 32935 1726853717.90422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 32935 1726853717.90452: stdout chunk (state=3): >>> import '_locale' # <<< 32935 1726853717.90459: stdout chunk (state=3): >>> <<< 32935 1726853717.90517: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615a7e60><<< 32935 1726853717.90547: stdout chunk (state=3): >>> import 'pwd' # <<< 32935 1726853717.90612: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 32935 1726853717.90675: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461411be0> <<< 32935 1726853717.90714: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.90760: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461413800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py<<< 32935 1726853717.90763: stdout chunk (state=3): >>> <<< 32935 1726853717.90786: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 32935 1726853717.90843: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461414200><<< 32935 1726853717.90850: stdout chunk (state=3): >>> <<< 32935 1726853717.90884: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 32935 1726853717.90889: stdout chunk (state=3): >>> <<< 32935 1726853717.90943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 32935 1726853717.90947: stdout chunk (state=3): >>> <<< 32935 1726853717.90981: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461415100><<< 32935 1726853717.91004: stdout chunk (state=3): >>> <<< 32935 1726853717.91033: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 32935 1726853717.91098: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 32935 1726853717.91104: stdout chunk (state=3): >>> <<< 32935 1726853717.91137: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py<<< 32935 1726853717.91148: stdout chunk (state=3): >>> <<< 32935 1726853717.91164: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 32935 1726853717.91170: stdout chunk (state=3): >>> <<< 32935 1726853717.91306: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461417e60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.91320: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.91326: stdout chunk (state=3): >>> <<< 32935 1726853717.91335: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461417dd0><<< 32935 1726853717.91373: stdout chunk (state=3): >>> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461416120> <<< 32935 1726853717.91413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 32935 1726853717.91466: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 32935 1726853717.91472: stdout chunk (state=3): >>> <<< 32935 1726853717.91519: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py<<< 32935 1726853717.91522: stdout chunk (state=3): >>> <<< 32935 1726853717.91523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 32935 1726853717.91562: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 32935 1726853717.91612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 32935 1726853717.91648: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 32935 1726853717.91659: stdout chunk (state=3): >>> <<< 32935 1726853717.91670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 32935 1726853717.91691: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46141fe30><<< 32935 1726853717.91720: stdout chunk (state=3): >>> import '_tokenize' # <<< 32935 1726853717.91725: stdout chunk (state=3): >>> <<< 32935 1726853717.91832: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46141e900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46141e660> <<< 32935 1726853717.91898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 32935 1726853717.92006: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46141ebd0><<< 32935 1726853717.92058: stdout chunk (state=3): >>> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461416630><<< 32935 1726853717.92065: stdout chunk (state=3): >>> <<< 32935 1726853717.92103: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.92123: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.92175: stdout chunk (state=3): >>> import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461467f20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 32935 1726853717.92182: stdout chunk (state=3): >>> <<< 32935 1726853717.92187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 32935 1726853717.92233: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4614681d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 32935 1726853717.92238: stdout chunk (state=3): >>> <<< 32935 1726853717.92266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 32935 1726853717.92302: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 32935 1726853717.92310: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 32935 1726853717.92367: stdout chunk (state=3): >>> # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.92370: stdout chunk (state=3): >>> # 
extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.92399: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461469c40><<< 32935 1726853717.92407: stdout chunk (state=3): >>> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461469a00><<< 32935 1726853717.92438: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py<<< 32935 1726853717.92443: stdout chunk (state=3): >>> <<< 32935 1726853717.92610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc'<<< 32935 1726853717.92669: stdout chunk (state=3): >>> # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.92696: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.92709: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe46146c200><<< 32935 1726853717.92719: stdout chunk (state=3): >>> <<< 32935 1726853717.92729: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46146a330><<< 32935 1726853717.92768: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 32935 1726853717.92775: stdout chunk (state=3): >>> <<< 32935 1726853717.92844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853717.92850: stdout chunk (state=3): >>> <<< 32935 1726853717.92894: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 32935 1726853717.92921: stdout chunk (state=3): >>> import '_string' # <<< 32935 1726853717.92998: stdout chunk (state=3): >>> import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46146f9e0><<< 32935 1726853717.93003: stdout chunk (state=3): >>> <<< 32935 1726853717.93216: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46146c3b0><<< 32935 1726853717.93222: stdout chunk (state=3): >>> <<< 32935 1726853717.93308: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.93311: stdout chunk (state=3): >>> <<< 32935 1726853717.93331: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.93333: stdout chunk (state=3): >>> <<< 32935 1726853717.93389: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461470a70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.93402: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 
32935 1726853717.93408: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461470a10><<< 32935 1726853717.93501: stdout chunk (state=3): >>> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461470b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461468320> <<< 32935 1726853717.93530: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 32935 1726853717.93541: stdout chunk (state=3): >>> <<< 32935 1726853717.93554: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 32935 1726853717.93562: stdout chunk (state=3): >>> <<< 32935 1726853717.93598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 32935 1726853717.93604: stdout chunk (state=3): >>> <<< 32935 1726853717.93647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 32935 1726853717.93652: stdout chunk (state=3): >>> <<< 32935 1726853717.93695: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.93740: stdout chunk (state=3): >>> # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.93755: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4614fc2f0><<< 32935 1726853717.93895: stdout chunk (state=3): >>> <<< 32935 1726853717.94011: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.94051: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.94054: stdout chunk (state=3): >>> <<< 32935 1726853717.94057: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4614fd6a0> <<< 32935 1726853717.94084: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461472ab0> <<< 32935 1726853717.94134: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.94154: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.94168: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461473e30> <<< 32935 1726853717.94182: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461472690><<< 32935 1726853717.94188: stdout chunk (state=3): >>> <<< 32935 1726853717.94211: stdout chunk 
(state=3): >>># zipimport: zlib available<<< 32935 1726853717.94216: stdout chunk (state=3): >>> <<< 32935 1726853717.94241: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.94256: stdout chunk (state=3): >>> import 'ansible.module_utils.compat' # <<< 32935 1726853717.94284: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853717.94289: stdout chunk (state=3): >>> <<< 32935 1726853717.94419: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.94555: stdout chunk (state=3): >>> # zipimport: zlib available <<< 32935 1726853717.94586: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.94612: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 32935 1726853717.94647: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.94681: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.94706: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 32935 1726853717.94742: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.94953: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.95154: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.95292: stdout chunk (state=3): >>> <<< 32935 1726853717.96084: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.97002: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 32935 1726853717.97030: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 32935 1726853717.97059: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 32935 1726853717.97083: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 32935 1726853717.97124: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 32935 1726853717.97167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853717.97173: stdout chunk (state=3): >>> <<< 32935 1726853717.97251: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 32935 1726853717.97259: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853717.97275: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461301850><<< 32935 1726853717.97398: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 32935 1726853717.97402: stdout chunk (state=3): >>> <<< 32935 1726853717.97425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 32935 1726853717.97466: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461302600> <<< 32935 1726853717.97499: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461472660> <<< 32935 1726853717.97572: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 32935 1726853717.97604: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 
1726853717.97646: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.97651: stdout chunk (state=3): >>> <<< 32935 1726853717.97680: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 32935 1726853717.97898: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.97943: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.97948: stdout chunk (state=3): >>> <<< 32935 1726853717.98184: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py<<< 32935 1726853717.98196: stdout chunk (state=3): >>> <<< 32935 1726853717.98212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc'<<< 32935 1726853717.98219: stdout chunk (state=3): >>> <<< 32935 1726853717.98235: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4613023c0><<< 32935 1726853717.98241: stdout chunk (state=3): >>> <<< 32935 1726853717.98263: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.98268: stdout chunk (state=3): >>> <<< 32935 1726853717.99053: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853717.99068: stdout chunk (state=3): >>> <<< 32935 1726853717.99783: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853717.99905: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853718.00022: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 32935 1726853718.00207: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 32935 1726853718.00275: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853718.00413: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 32935 1726853718.00624: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853718.00629: stdout chunk (state=3): >>> # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 32935 1726853718.00663: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853718.01069: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853718.01499: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 32935 1726853718.01551: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 32935 1726853718.01594: stdout chunk (state=3): >>>import '_ast' # <<< 32935 1726853718.01606: stdout chunk (state=3): >>> <<< 32935 1726853718.01705: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4613038f0> <<< 32935 1726853718.01734: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853718.01848: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853718.01853: stdout chunk (state=3): >>> <<< 32935 1726853718.01960: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 32935 1726853718.01981: stdout chunk (state=3): >>> <<< 32935 1726853718.01984: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # <<< 32935 1726853718.01998: stdout chunk (state=3): >>> <<< 32935 1726853718.02002: stdout 
chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 32935 1726853718.02033: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 32935 1726853718.02038: stdout chunk (state=3): >>> <<< 32935 1726853718.02065: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853718.02072: stdout chunk (state=3): >>> <<< 32935 1726853718.02144: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853718.02146: stdout chunk (state=3): >>> <<< 32935 1726853718.02201: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 32935 1726853718.02206: stdout chunk (state=3): >>> <<< 32935 1726853718.02224: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853718.02295: stdout chunk (state=3): >>> # zipimport: zlib available <<< 32935 1726853718.02376: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853718.02481: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853718.02703: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py<<< 32935 1726853718.02782: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853718.02785: stdout chunk (state=3): >>> <<< 32935 1726853718.02957: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853718.02969: stdout chunk (state=3): >>> <<< 32935 1726853718.02979: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 32935 1726853718.02995: stdout chunk (state=3): >>> <<< 32935 1726853718.03020: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe46130e270><<< 32935 1726853718.03084: stdout chunk (state=3): >>> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4613091c0><<< 32935 1726853718.03089: stdout chunk (state=3): >>> <<< 32935 1726853718.03135: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 32935 1726853718.03144: stdout chunk (state=3): >>> <<< 32935 1726853718.03161: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 32935 1726853718.03191: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853718.03196: stdout chunk (state=3): >>> <<< 32935 1726853718.03384: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853718.03492: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853718.03563: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853718.03569: stdout chunk (state=3): >>> <<< 32935 1726853718.03639: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 32935 1726853718.03643: stdout chunk (state=3): >>> <<< 32935 1726853718.03660: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 32935 1726853718.03694: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py<<< 32935 1726853718.03698: stdout chunk (state=3): >>> <<< 32935 1726853718.03740: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 32935 1726853718.03775: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 32935 1726853718.03782: stdout chunk (state=3): >>> <<< 32935 1726853718.03880: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 32935 1726853718.03996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 32935 1726853718.04021: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615feb40><<< 32935 1726853718.04026: stdout chunk (state=3): >>> <<< 32935 1726853718.04099: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615ee810> <<< 32935 1726853718.04212: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46130e1e0> <<< 32935 1726853718.04243: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461471970> <<< 32935 1726853718.04250: stdout chunk (state=3): >>># destroy ansible.module_utils.distro<<< 32935 1726853718.04267: stdout chunk (state=3): >>> <<< 32935 1726853718.04274: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 32935 1726853718.04301: stdout chunk (state=3): >>># zipimport: zlib available<<< 32935 1726853718.04346: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853718.04351: stdout chunk (state=3): >>> <<< 32935 1726853718.04387: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 32935 1726853718.04403: stdout chunk (state=3): >>> <<< 32935 1726853718.04407: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 32935 1726853718.04423: stdout chunk (state=3): >>> <<< 32935 1726853718.04496: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 32935 1726853718.04530: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853718.04533: stdout chunk (state=3): >>> <<< 32935 1726853718.04564: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 32935 1726853718.04588: stdout chunk (state=3): >>> # zipimport: zlib available<<< 32935 1726853718.04606: stdout chunk (state=3): >>> <<< 32935 1726853718.04994: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853718.05137: stdout chunk (state=3): >>># zipimport: zlib available <<< 32935 1726853718.05336: stdout chunk (state=3): >>> <<< 32935 1726853718.05350: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}<<< 32935 1726853718.05359: stdout chunk (state=3): >>> <<< 32935 1726853718.05395: stdout chunk (state=3): >>># destroy __main__<<< 32935 1726853718.05402: stdout chunk (state=3): >>> <<< 32935 1726853718.05901: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 32935 1726853718.05908: stdout chunk 
(state=3): >>> <<< 32935 1726853718.05917: stdout chunk (state=3): >>># clear sys.path_hooks<<< 32935 1726853718.05935: stdout chunk (state=3): >>> # clear builtins._<<< 32935 1726853718.05951: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv<<< 32935 1726853718.05977: stdout chunk (state=3): >>> # clear sys.ps1<<< 32935 1726853718.05980: stdout chunk (state=3): >>> # clear sys.ps2 # clear sys.last_exc<<< 32935 1726853718.06008: stdout chunk (state=3): >>> # clear sys.last_type<<< 32935 1726853718.06019: stdout chunk (state=3): >>> # clear sys.last_value <<< 32935 1726853718.06034: stdout chunk (state=3): >>># clear sys.last_traceback<<< 32935 1726853718.06047: stdout chunk (state=3): >>> # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin<<< 32935 1726853718.06067: stdout chunk (state=3): >>> # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport<<< 32935 1726853718.06094: stdout chunk (state=3): >>> # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8<<< 32935 1726853718.06097: stdout chunk (state=3): >>> # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc<<< 32935 1726853718.06120: stdout chunk (state=3): >>> # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack <<< 32935 1726853718.06143: stdout chunk (state=3): >>># destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 32935 1726853718.06170: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii<<< 32935 1726853718.06188: stdout chunk (state=3): >>> # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno<<< 32935 1726853718.06217: stdout chunk (state=3): >>> # cleanup[2] removing zlib # cleanup[2] removing 
_compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib <<< 32935 1726853718.06239: stdout chunk (state=3): >>># cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress<<< 32935 1726853718.06264: stdout chunk (state=3): >>> # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing<<< 32935 1726853718.06277: stdout chunk (state=3): >>> # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner<<< 32935 1726853718.06302: stdout chunk (state=3): >>> # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors<<< 32935 1726853718.06324: stdout chunk (state=3): >>> # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback<<< 32935 1726853718.06346: stdout chunk (state=3): >>> # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string<<< 32935 1726853718.06375: stdout chunk (state=3): >>> # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat<<< 32935 1726853718.06396: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing 
ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters<<< 32935 1726853718.06417: stdout chunk (state=3): >>> # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings<<< 32935 1726853718.06437: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast<<< 32935 1726853718.06476: stdout chunk (state=3): >>> # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec<<< 32935 1726853718.06503: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info<<< 32935 1726853718.06697: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 32935 1726853718.06884: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 32935 1726853718.06920: stdout chunk (state=3): >>># destroy importlib.machinery <<< 32935 1726853718.06928: stdout chunk (state=3): >>># destroy importlib._abc<<< 32935 1726853718.06944: stdout chunk (state=3): >>> <<< 32935 1726853718.06949: stdout chunk (state=3): >>># destroy importlib.util <<< 32935 1726853718.06979: stdout chunk (state=3): >>># destroy _bz2 <<< 32935 1726853718.07004: stdout chunk (state=3): >>># destroy _compression <<< 32935 1726853718.07028: stdout chunk (state=3): >>># destroy _lzma <<< 32935 1726853718.07044: stdout chunk (state=3): >>># destroy _blake2<<< 32935 1726853718.07050: stdout chunk (state=3): >>> <<< 32935 1726853718.07075: stdout chunk (state=3): 
>>># destroy binascii<<< 32935 1726853718.07079: stdout chunk (state=3): >>> # destroy struct <<< 32935 1726853718.07101: stdout chunk (state=3): >>># destroy zlib <<< 32935 1726853718.07125: stdout chunk (state=3): >>># destroy bz2<<< 32935 1726853718.07131: stdout chunk (state=3): >>> # destroy lzma <<< 32935 1726853718.07159: stdout chunk (state=3): >>># destroy zipfile._path # destroy zipfile<<< 32935 1726853718.07172: stdout chunk (state=3): >>> # destroy pathlib # destroy zipfile._path.glob<<< 32935 1726853718.07218: stdout chunk (state=3): >>> # destroy fnmatch # destroy ipaddress # destroy ntpath<<< 32935 1726853718.07223: stdout chunk (state=3): >>> <<< 32935 1726853718.07248: stdout chunk (state=3): >>># destroy importlib<<< 32935 1726853718.07255: stdout chunk (state=3): >>> <<< 32935 1726853718.07268: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal<<< 32935 1726853718.07400: stdout chunk (state=3): >>> # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 32935 1726853718.07461: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux<<< 32935 1726853718.07464: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian<<< 32935 1726853718.07490: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes<<< 32935 1726853718.07508: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 32935 1726853718.07515: stdout chunk (state=3): >>> <<< 32935 1726853718.07533: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves<<< 32935 1726853718.07545: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon<<< 32935 1726853718.07551: stdout chunk (state=3): >>> <<< 32935 1726853718.07574: stdout chunk (state=3): >>># cleanup[3] wiping _socket <<< 32935 1726853718.07598: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 32935 1726853718.07619: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 32935 1726853718.07640: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 32935 1726853718.07656: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 32935 1726853718.07676: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random<<< 32935 1726853718.07702: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct <<< 32935 
1726853718.07727: stdout chunk (state=3): >>># cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser<<< 32935 1726853718.07750: stdout chunk (state=3): >>> # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 32935 1726853718.07773: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator<<< 32935 1726853718.07796: stdout chunk (state=3): >>> # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath<<< 32935 1726853718.07810: stdout chunk (state=3): >>> # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io<<< 32935 1726853718.07834: stdout chunk (state=3): >>> # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 32935 1726853718.07860: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref<<< 32935 1726853718.07892: stdout chunk (state=3): >>> # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 32935 1726853718.07907: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon<<< 32935 1726853718.08100: stdout chunk (state=3): >>> # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 32935 1726853718.08126: stdout chunk (state=3): >>># destroy sys.monitoring<<< 32935 1726853718.08129: stdout chunk (state=3): >>> <<< 32935 1726853718.08132: stdout chunk (state=3): >>># destroy _socket<<< 32935 1726853718.08151: stdout chunk (state=3): >>> <<< 32935 1726853718.08169: stdout chunk (state=3): >>># destroy _collections<<< 32935 1726853718.08176: stdout chunk (state=3): >>> <<< 32935 1726853718.08210: stdout chunk (state=3): >>># destroy platform<<< 32935 1726853718.08229: stdout chunk (state=3): >>> <<< 32935 1726853718.08253: stdout chunk (state=3): >>># destroy _uuid <<< 32935 1726853718.08256: stdout chunk (state=3): >>># destroy stat # destroy genericpath<<< 32935 1726853718.08265: stdout chunk (state=3): >>> # destroy re._parser<<< 32935 1726853718.08277: stdout chunk (state=3): >>> # destroy tokenize<<< 32935 1726853718.08289: stdout chunk (state=3): >>> <<< 32935 1726853718.08318: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib<<< 32935 1726853718.08336: stdout chunk (state=3): >>> <<< 32935 1726853718.08349: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 32935 1726853718.08388: stdout chunk (state=3): >>># destroy _typing<<< 32935 1726853718.08409: stdout chunk (state=3): >>> <<< 32935 1726853718.08412: stdout chunk (state=3): >>># destroy _tokenize <<< 32935 1726853718.08425: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse<<< 32935 1726853718.08428: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error<<< 32935 1726853718.08450: stdout chunk (state=3): >>> # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves<<< 32935 1726853718.08474: stdout chunk (state=3): >>> # destroy _frozen_importlib_external # destroy _imp<<< 32935 1726853718.08534: stdout chunk (state=3): >>> # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 32935 1726853718.08646: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 32935 1726853718.08732: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 32935 1726853718.08752: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 32935 1726853718.08895: stdout chunk (state=3): >>># clear sys.audit hooks <<< 32935 1726853718.09182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853718.09213: stderr chunk (state=3): >>><<< 32935 1726853718.09217: stdout chunk (state=3): >>><<< 32935 1726853718.09351: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461f684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461f37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461f6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d5d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d5dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d9bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d9bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461dd3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461dd3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461db3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461db12b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d99070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461df37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461df23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461db2150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461df0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e28890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d982f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e28d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e28bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e28fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461d96e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e29670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e29370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e2a540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e40740> import 'errno' # 
# extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e41e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e42cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e432f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e42210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461e43d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e434a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e2a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461bbfbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461be8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461be84a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461be8770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object 
from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461be90a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461be9a00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461be8950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461bbddc0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461beade0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461be9b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461e2a6f0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c17140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c37530> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c982c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c9aa20> import 'urllib.parse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe461c983e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c5d2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461529340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461c36330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461bebd10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe461c36930> # zipimport: found 30 names in '/tmp/ansible_stat_payload_prw9nu5c/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46157f050> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46155df40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46155d0d0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46157d6d0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4615a6990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615a6720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615a6030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object 
from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615a6a80> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461be8440> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4615a76e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4615a7920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615a7e60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461411be0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461413800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461414200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461415100> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461417e60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461417dd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461416120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # 
code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46141fe30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46141e900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46141e660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46141ebd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461416630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461467f20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4614681d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461469c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461469a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe46146c200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46146a330> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46146f9e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46146c3b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461470a70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461470a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461470b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461468320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4614fc2f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4614fd6a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461472ab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461473e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461472690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe461301850> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461302600> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461472660> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4613023c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4613038f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fe46130e270> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4613091c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615feb40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4615ee810> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe46130e1e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe461471970> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] 
removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy 
string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma 
# destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy 
ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
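
Everything from the "# ... matches ..." and "import 'x' # <loader>" lines down to "# clear sys.audit hooks" is the remote interpreter's own import/shutdown tracing (this is the format CPython emits when started in verbose mode), wrapped around the single JSON result line {"changed": false, "stat": {"exists": false}, ...} that the stat module actually returned. The controller has to pick that JSON line out of the surrounding noise, which is what the [WARNING] about "junk after the JSON data" just below refers to. A minimal standalone sketch of that filtering step, using only the json module; extract_module_json and the sample string are illustrative, not Ansible's actual parser:

    import json

    def extract_module_json(stdout: str) -> dict:
        """Return the first stdout line that decodes as a JSON object.

        Simplified illustration of how a module result can be separated from
        interpreter chatter; Ansible's real action-plugin parsing is more
        involved and also keeps the leftover text for the junk warning.
        """
        for line in stdout.splitlines():
            stripped = line.strip()
            if stripped.startswith("{"):
                try:
                    return json.loads(stripped)
                except ValueError:
                    continue  # looked like JSON but was not; keep scanning
        raise ValueError("no JSON result found in module output")

    # Trimmed stand-in for the output captured above.
    sample = (
        "import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x...>\n"
        '{"changed": false, "stat": {"exists": false}}\n'
        "# destroy __main__\n"
    )
    print(extract_module_json(sample))  # {'changed': False, 'stat': {'exists': False}}

In the sample, the lines before and after the JSON play the role of the import and cleanup noise above; only the middle line survives the scan.
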
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 32935 1726853718.10342: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853718.10345: _low_level_execute_command(): starting 32935 1726853718.10348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726853717.5400386-33100-185259019179793/ > /dev/null 2>&1 && sleep 0' 32935 1726853718.10350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853718.10352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853718.10354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853718.10356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853718.10358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853718.10360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853718.10475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853718.10489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853718.10554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853718.13113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853718.13138: stderr chunk (state=3): >>><<< 32935 1726853718.13142: stdout chunk (state=3): >>><<< 32935 1726853718.13155: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853718.13164: handler run complete 32935 1726853718.13182: attempt loop complete, returning result 32935 1726853718.13185: _execute() done 32935 1726853718.13190: dumping result to json 32935 1726853718.13193: done dumping result, returning 32935 1726853718.13200: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree 
[02083763-bbaf-84df-441d-0000000000c2] 32935 1726853718.13203: sending task result for task 02083763-bbaf-84df-441d-0000000000c2 32935 1726853718.13291: done sending task result for task 02083763-bbaf-84df-441d-0000000000c2 32935 1726853718.13293: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 32935 1726853718.13357: no more pending results, returning what we have 32935 1726853718.13359: results queue empty 32935 1726853718.13360: checking for any_errors_fatal 32935 1726853718.13366: done checking for any_errors_fatal 32935 1726853718.13367: checking for max_fail_percentage 32935 1726853718.13368: done checking for max_fail_percentage 32935 1726853718.13369: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.13370: done checking to see if all hosts have failed 32935 1726853718.13372: getting the remaining hosts for this loop 32935 1726853718.13374: done getting the remaining hosts for this loop 32935 1726853718.13377: getting the next task for host managed_node1 32935 1726853718.13383: done getting next task for host managed_node1 32935 1726853718.13385: ^ task is: TASK: Set flag to indicate system is ostree 32935 1726853718.13388: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853718.13392: getting variables 32935 1726853718.13394: in VariableManager get_vars() 32935 1726853718.13423: Calling all_inventory to load vars for managed_node1 32935 1726853718.13425: Calling groups_inventory to load vars for managed_node1 32935 1726853718.13428: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.13439: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.13442: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.13444: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.13605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.13720: done with get_vars() 32935 1726853718.13728: done getting variables 32935 1726853718.13800: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:35:18 -0400 (0:00:00.647) 0:00:03.273 ****** 32935 1726853718.13822: entering _queue_task() for managed_node1/set_fact 32935 1726853718.13823: Creating lock for set_fact 32935 1726853718.14030: worker is 1 (out of 1 available) 32935 1726853718.14042: exiting _queue_task() for managed_node1/set_fact 32935 1726853718.14054: done queuing things up, now waiting for results queue to drain 32935 1726853718.14056: waiting for pending results... 
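For orientation, the two tasks traced around this point come from el_repo_setup.yml: the stat check above came back with exists: false, and the set_fact task queued next turns that result into the __network_is_ostree flag. A minimal YAML reconstruction is shown below; the /run/ostree-booted path is an assumption for illustration, since the log only records the registered variable name and the resulting fact value.

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # assumed marker path; the log only shows stat.exists == false
      register: __ostree_booted_stat

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined   # the conditional evaluated in the trace that follows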
32935 1726853718.14198: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 32935 1726853718.14286: in run() - task 02083763-bbaf-84df-441d-0000000000c3 32935 1726853718.14290: variable 'ansible_search_path' from source: unknown 32935 1726853718.14293: variable 'ansible_search_path' from source: unknown 32935 1726853718.14385: calling self._execute() 32935 1726853718.14388: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.14409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.14413: variable 'omit' from source: magic vars 32935 1726853718.14843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853718.15479: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853718.15483: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853718.15485: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853718.15488: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853718.15491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853718.15493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853718.15496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853718.15498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853718.15500: Evaluated conditional (not __network_is_ostree is defined): True 32935 1726853718.15502: variable 'omit' from source: magic vars 32935 1726853718.15527: variable 'omit' from source: magic vars 32935 1726853718.15781: variable '__ostree_booted_stat' from source: set_fact 32935 1726853718.15784: variable 'omit' from source: magic vars 32935 1726853718.15787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853718.15790: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853718.15792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853718.15794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853718.15796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853718.15808: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853718.15811: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.15813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.15894: Set connection var ansible_timeout to 10 32935 1726853718.15900: 
Set connection var ansible_shell_type to sh 32935 1726853718.15909: Set connection var ansible_pipelining to False 32935 1726853718.15911: Set connection var ansible_connection to ssh 32935 1726853718.15916: Set connection var ansible_shell_executable to /bin/sh 32935 1726853718.15921: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853718.15947: variable 'ansible_shell_executable' from source: unknown 32935 1726853718.15951: variable 'ansible_connection' from source: unknown 32935 1726853718.15953: variable 'ansible_module_compression' from source: unknown 32935 1726853718.15955: variable 'ansible_shell_type' from source: unknown 32935 1726853718.15960: variable 'ansible_shell_executable' from source: unknown 32935 1726853718.15963: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.15965: variable 'ansible_pipelining' from source: unknown 32935 1726853718.15968: variable 'ansible_timeout' from source: unknown 32935 1726853718.15970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.16061: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853718.16065: variable 'omit' from source: magic vars 32935 1726853718.16073: starting attempt loop 32935 1726853718.16077: running the handler 32935 1726853718.16086: handler run complete 32935 1726853718.16095: attempt loop complete, returning result 32935 1726853718.16104: _execute() done 32935 1726853718.16108: dumping result to json 32935 1726853718.16110: done dumping result, returning 32935 1726853718.16112: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [02083763-bbaf-84df-441d-0000000000c3] 32935 1726853718.16114: sending task result for task 02083763-bbaf-84df-441d-0000000000c3 32935 1726853718.16192: done sending task result for task 02083763-bbaf-84df-441d-0000000000c3 32935 1726853718.16195: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 32935 1726853718.16283: no more pending results, returning what we have 32935 1726853718.16285: results queue empty 32935 1726853718.16286: checking for any_errors_fatal 32935 1726853718.16292: done checking for any_errors_fatal 32935 1726853718.16292: checking for max_fail_percentage 32935 1726853718.16294: done checking for max_fail_percentage 32935 1726853718.16294: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.16295: done checking to see if all hosts have failed 32935 1726853718.16296: getting the remaining hosts for this loop 32935 1726853718.16297: done getting the remaining hosts for this loop 32935 1726853718.16300: getting the next task for host managed_node1 32935 1726853718.16308: done getting next task for host managed_node1 32935 1726853718.16311: ^ task is: TASK: Fix CentOS6 Base repo 32935 1726853718.16313: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853718.16323: getting variables 32935 1726853718.16324: in VariableManager get_vars() 32935 1726853718.16348: Calling all_inventory to load vars for managed_node1 32935 1726853718.16351: Calling groups_inventory to load vars for managed_node1 32935 1726853718.16354: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.16365: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.16367: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.16377: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.16600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.16791: done with get_vars() 32935 1726853718.16801: done getting variables 32935 1726853718.16913: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:35:18 -0400 (0:00:00.031) 0:00:03.305 ****** 32935 1726853718.16940: entering _queue_task() for managed_node1/copy 32935 1726853718.17395: worker is 1 (out of 1 available) 32935 1726853718.17403: exiting _queue_task() for managed_node1/copy 32935 1726853718.17412: done queuing things up, now waiting for results queue to drain 32935 1726853718.17414: waiting for pending results... 
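The 'Fix CentOS6 Base repo' task queued above is a copy action guarded by two conditions, and the skip recorded next reports the first condition that evaluated False as false_condition. A sketch of that shape follows; the destination and file body are placeholders for illustration and are not taken from this log.

    - name: Fix CentOS6 Base repo
      copy:
        dest: /etc/yum.repos.d/CentOS-Base.repo        # hypothetical destination
        content: "# repo definition omitted in this sketch"
      when:
        - ansible_distribution == 'CentOS'              # evaluates True on this host
        - ansible_distribution_major_version == '6'     # evaluates False, so the task is skipped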
32935 1726853718.17539: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 32935 1726853718.17634: in run() - task 02083763-bbaf-84df-441d-0000000000c5 32935 1726853718.17639: variable 'ansible_search_path' from source: unknown 32935 1726853718.17643: variable 'ansible_search_path' from source: unknown 32935 1726853718.17646: calling self._execute() 32935 1726853718.17674: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.17679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.17690: variable 'omit' from source: magic vars 32935 1726853718.18247: variable 'ansible_distribution' from source: facts 32935 1726853718.18290: Evaluated conditional (ansible_distribution == 'CentOS'): True 32935 1726853718.18431: variable 'ansible_distribution_major_version' from source: facts 32935 1726853718.18443: Evaluated conditional (ansible_distribution_major_version == '6'): False 32935 1726853718.18451: when evaluation is False, skipping this task 32935 1726853718.18508: _execute() done 32935 1726853718.18512: dumping result to json 32935 1726853718.18519: done dumping result, returning 32935 1726853718.18522: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [02083763-bbaf-84df-441d-0000000000c5] 32935 1726853718.18524: sending task result for task 02083763-bbaf-84df-441d-0000000000c5 32935 1726853718.18684: done sending task result for task 02083763-bbaf-84df-441d-0000000000c5 32935 1726853718.18689: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 32935 1726853718.18750: no more pending results, returning what we have 32935 1726853718.18753: results queue empty 32935 1726853718.18754: checking for any_errors_fatal 32935 1726853718.18758: done checking for any_errors_fatal 32935 1726853718.18758: checking for max_fail_percentage 32935 1726853718.18760: done checking for max_fail_percentage 32935 1726853718.18761: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.18762: done checking to see if all hosts have failed 32935 1726853718.18763: getting the remaining hosts for this loop 32935 1726853718.18764: done getting the remaining hosts for this loop 32935 1726853718.18767: getting the next task for host managed_node1 32935 1726853718.18776: done getting next task for host managed_node1 32935 1726853718.18778: ^ task is: TASK: Include the task 'enable_epel.yml' 32935 1726853718.18781: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853718.18786: getting variables 32935 1726853718.18788: in VariableManager get_vars() 32935 1726853718.18816: Calling all_inventory to load vars for managed_node1 32935 1726853718.18818: Calling groups_inventory to load vars for managed_node1 32935 1726853718.18822: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.18833: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.18835: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.18838: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.19102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.19283: done with get_vars() 32935 1726853718.19293: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:35:18 -0400 (0:00:00.024) 0:00:03.329 ****** 32935 1726853718.19386: entering _queue_task() for managed_node1/include_tasks 32935 1726853718.19653: worker is 1 (out of 1 available) 32935 1726853718.19666: exiting _queue_task() for managed_node1/include_tasks 32935 1726853718.19781: done queuing things up, now waiting for results queue to drain 32935 1726853718.19783: waiting for pending results... 32935 1726853718.19936: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 32935 1726853718.20028: in run() - task 02083763-bbaf-84df-441d-0000000000c6 32935 1726853718.20119: variable 'ansible_search_path' from source: unknown 32935 1726853718.20123: variable 'ansible_search_path' from source: unknown 32935 1726853718.20126: calling self._execute() 32935 1726853718.20143: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.20149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.20161: variable 'omit' from source: magic vars 32935 1726853718.20725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853718.22776: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853718.22856: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853718.22921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853718.22931: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853718.22960: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853718.23041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853718.23138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853718.23141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 32935 1726853718.23154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853718.23177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853718.23301: variable '__network_is_ostree' from source: set_fact 32935 1726853718.23326: Evaluated conditional (not __network_is_ostree | d(false)): True 32935 1726853718.23338: _execute() done 32935 1726853718.23346: dumping result to json 32935 1726853718.23359: done dumping result, returning 32935 1726853718.23370: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-84df-441d-0000000000c6] 32935 1726853718.23381: sending task result for task 02083763-bbaf-84df-441d-0000000000c6 32935 1726853718.23509: no more pending results, returning what we have 32935 1726853718.23514: in VariableManager get_vars() 32935 1726853718.23551: Calling all_inventory to load vars for managed_node1 32935 1726853718.23554: Calling groups_inventory to load vars for managed_node1 32935 1726853718.23558: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.23569: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.23574: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.23578: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.24074: done sending task result for task 02083763-bbaf-84df-441d-0000000000c6 32935 1726853718.24077: WORKER PROCESS EXITING 32935 1726853718.24101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.24281: done with get_vars() 32935 1726853718.24288: variable 'ansible_search_path' from source: unknown 32935 1726853718.24289: variable 'ansible_search_path' from source: unknown 32935 1726853718.24325: we have included files to process 32935 1726853718.24326: generating all_blocks data 32935 1726853718.24328: done generating all_blocks data 32935 1726853718.24333: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 32935 1726853718.24335: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 32935 1726853718.24337: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 32935 1726853718.24996: done processing included file 32935 1726853718.24998: iterating over new_blocks loaded from include file 32935 1726853718.25000: in VariableManager get_vars() 32935 1726853718.25011: done with get_vars() 32935 1726853718.25012: filtering new block on tags 32935 1726853718.25033: done filtering new block on tags 32935 1726853718.25036: in VariableManager get_vars() 32935 1726853718.25045: done with get_vars() 32935 1726853718.25047: filtering new block on tags 32935 1726853718.25058: done filtering new block on tags 32935 1726853718.25059: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 32935 1726853718.25065: extending task lists for all hosts with included blocks 
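The include just processed is gated on the ostree flag set earlier: because __network_is_ostree is false, the conditional not __network_is_ostree | d(false) evaluates True and enable_epel.yml is loaded for managed_node1. A hedged sketch of the include task, assuming a path relative to el_repo_setup.yml:

    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml            # assumed relative path; both files sit under tests/network/tasks/
      when: not __network_is_ostree | d(false)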
32935 1726853718.25156: done extending task lists 32935 1726853718.25157: done processing included files 32935 1726853718.25158: results queue empty 32935 1726853718.25159: checking for any_errors_fatal 32935 1726853718.25161: done checking for any_errors_fatal 32935 1726853718.25162: checking for max_fail_percentage 32935 1726853718.25163: done checking for max_fail_percentage 32935 1726853718.25164: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.25165: done checking to see if all hosts have failed 32935 1726853718.25165: getting the remaining hosts for this loop 32935 1726853718.25166: done getting the remaining hosts for this loop 32935 1726853718.25169: getting the next task for host managed_node1 32935 1726853718.25174: done getting next task for host managed_node1 32935 1726853718.25177: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 32935 1726853718.25179: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853718.25181: getting variables 32935 1726853718.25182: in VariableManager get_vars() 32935 1726853718.25190: Calling all_inventory to load vars for managed_node1 32935 1726853718.25192: Calling groups_inventory to load vars for managed_node1 32935 1726853718.25194: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.25199: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.25207: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.25210: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.25360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.25545: done with get_vars() 32935 1726853718.25553: done getting variables 32935 1726853718.25618: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 32935 1726853718.25805: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:35:18 -0400 (0:00:00.064) 0:00:03.394 ****** 32935 1726853718.25850: entering _queue_task() for managed_node1/command 32935 1726853718.25852: Creating lock for command 32935 1726853718.26149: worker is 1 (out of 1 available) 32935 1726853718.26162: exiting _queue_task() for managed_node1/command 32935 1726853718.26277: done queuing things up, now waiting for results queue to drain 32935 1726853718.26279: waiting for pending results... 
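The task banner above shows a templated name: 'Create EPEL {{ ansible_distribution_major_version }}' renders to 'Create EPEL 10' from the gathered facts, and the task is then skipped because major version 10 is not in ['7', '8']. The pattern looks roughly like the sketch below; the actual command is not visible in this log, so a placeholder is used.

    - name: Create EPEL {{ ansible_distribution_major_version }}
      command: echo "EPEL release install command goes here"   # placeholder, not taken from the log
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']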
32935 1726853718.26423: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 32935 1726853718.26578: in run() - task 02083763-bbaf-84df-441d-0000000000e0 32935 1726853718.26581: variable 'ansible_search_path' from source: unknown 32935 1726853718.26583: variable 'ansible_search_path' from source: unknown 32935 1726853718.26589: calling self._execute() 32935 1726853718.26664: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.26676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.26690: variable 'omit' from source: magic vars 32935 1726853718.27154: variable 'ansible_distribution' from source: facts 32935 1726853718.27158: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32935 1726853718.27190: variable 'ansible_distribution_major_version' from source: facts 32935 1726853718.27200: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 32935 1726853718.27207: when evaluation is False, skipping this task 32935 1726853718.27213: _execute() done 32935 1726853718.27219: dumping result to json 32935 1726853718.27226: done dumping result, returning 32935 1726853718.27236: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [02083763-bbaf-84df-441d-0000000000e0] 32935 1726853718.27245: sending task result for task 02083763-bbaf-84df-441d-0000000000e0 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 32935 1726853718.27428: no more pending results, returning what we have 32935 1726853718.27432: results queue empty 32935 1726853718.27433: checking for any_errors_fatal 32935 1726853718.27434: done checking for any_errors_fatal 32935 1726853718.27435: checking for max_fail_percentage 32935 1726853718.27437: done checking for max_fail_percentage 32935 1726853718.27438: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.27439: done checking to see if all hosts have failed 32935 1726853718.27440: getting the remaining hosts for this loop 32935 1726853718.27441: done getting the remaining hosts for this loop 32935 1726853718.27444: getting the next task for host managed_node1 32935 1726853718.27453: done getting next task for host managed_node1 32935 1726853718.27456: ^ task is: TASK: Install yum-utils package 32935 1726853718.27460: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853718.27464: getting variables 32935 1726853718.27466: in VariableManager get_vars() 32935 1726853718.27498: Calling all_inventory to load vars for managed_node1 32935 1726853718.27501: Calling groups_inventory to load vars for managed_node1 32935 1726853718.27505: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.27519: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.27522: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.27525: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.27916: done sending task result for task 02083763-bbaf-84df-441d-0000000000e0 32935 1726853718.27918: WORKER PROCESS EXITING 32935 1726853718.27937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.28122: done with get_vars() 32935 1726853718.28131: done getting variables 32935 1726853718.28221: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:35:18 -0400 (0:00:00.023) 0:00:03.418 ****** 32935 1726853718.28248: entering _queue_task() for managed_node1/package 32935 1726853718.28250: Creating lock for package 32935 1726853718.28685: worker is 1 (out of 1 available) 32935 1726853718.28693: exiting _queue_task() for managed_node1/package 32935 1726853718.28703: done queuing things up, now waiting for results queue to drain 32935 1726853718.28704: waiting for pending results... 
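The next task uses the generic package action (note the Loading ActionModule 'package' line above) and is guarded by the same EL 7/8 conditions, so it also skips on this host. A minimal sketch:

    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']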
32935 1726853718.28747: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 32935 1726853718.28859: in run() - task 02083763-bbaf-84df-441d-0000000000e1 32935 1726853718.28877: variable 'ansible_search_path' from source: unknown 32935 1726853718.28884: variable 'ansible_search_path' from source: unknown 32935 1726853718.28920: calling self._execute() 32935 1726853718.28995: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.29006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.29019: variable 'omit' from source: magic vars 32935 1726853718.29389: variable 'ansible_distribution' from source: facts 32935 1726853718.29407: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32935 1726853718.29523: variable 'ansible_distribution_major_version' from source: facts 32935 1726853718.29533: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 32935 1726853718.29539: when evaluation is False, skipping this task 32935 1726853718.29545: _execute() done 32935 1726853718.29550: dumping result to json 32935 1726853718.29556: done dumping result, returning 32935 1726853718.29565: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [02083763-bbaf-84df-441d-0000000000e1] 32935 1726853718.29577: sending task result for task 02083763-bbaf-84df-441d-0000000000e1 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 32935 1726853718.29728: no more pending results, returning what we have 32935 1726853718.29732: results queue empty 32935 1726853718.29733: checking for any_errors_fatal 32935 1726853718.29738: done checking for any_errors_fatal 32935 1726853718.29739: checking for max_fail_percentage 32935 1726853718.29741: done checking for max_fail_percentage 32935 1726853718.29741: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.29743: done checking to see if all hosts have failed 32935 1726853718.29744: getting the remaining hosts for this loop 32935 1726853718.29745: done getting the remaining hosts for this loop 32935 1726853718.29749: getting the next task for host managed_node1 32935 1726853718.29758: done getting next task for host managed_node1 32935 1726853718.29760: ^ task is: TASK: Enable EPEL 7 32935 1726853718.29764: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853718.29768: getting variables 32935 1726853718.29773: in VariableManager get_vars() 32935 1726853718.29806: Calling all_inventory to load vars for managed_node1 32935 1726853718.29809: Calling groups_inventory to load vars for managed_node1 32935 1726853718.29813: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.29826: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.29829: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.29833: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.30223: done sending task result for task 02083763-bbaf-84df-441d-0000000000e1 32935 1726853718.30226: WORKER PROCESS EXITING 32935 1726853718.30250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.30435: done with get_vars() 32935 1726853718.30445: done getting variables 32935 1726853718.30502: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:35:18 -0400 (0:00:00.022) 0:00:03.440 ****** 32935 1726853718.30531: entering _queue_task() for managed_node1/command 32935 1726853718.30975: worker is 1 (out of 1 available) 32935 1726853718.30983: exiting _queue_task() for managed_node1/command 32935 1726853718.30992: done queuing things up, now waiting for results queue to drain 32935 1726853718.30993: waiting for pending results... 
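'Enable EPEL 7' (queued above) and 'Enable EPEL 8' (which follows) are both command tasks and both skip here for the same version reason. The actual commands are not in this log; the sketch below assumes the usual yum-config-manager invocation purely for illustration, with the EPEL 8 task presumably mirroring it via dnf config-manager.

    - name: Enable EPEL 7
      command: yum-config-manager --enable epel    # assumed invocation, not shown in this log
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']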
32935 1726853718.31042: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 32935 1726853718.31158: in run() - task 02083763-bbaf-84df-441d-0000000000e2 32935 1726853718.31179: variable 'ansible_search_path' from source: unknown 32935 1726853718.31186: variable 'ansible_search_path' from source: unknown 32935 1726853718.31230: calling self._execute() 32935 1726853718.31303: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.31315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.31334: variable 'omit' from source: magic vars 32935 1726853718.31846: variable 'ansible_distribution' from source: facts 32935 1726853718.31858: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32935 1726853718.31995: variable 'ansible_distribution_major_version' from source: facts 32935 1726853718.32001: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 32935 1726853718.32004: when evaluation is False, skipping this task 32935 1726853718.32007: _execute() done 32935 1726853718.32010: dumping result to json 32935 1726853718.32087: done dumping result, returning 32935 1726853718.32090: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [02083763-bbaf-84df-441d-0000000000e2] 32935 1726853718.32092: sending task result for task 02083763-bbaf-84df-441d-0000000000e2 32935 1726853718.32149: done sending task result for task 02083763-bbaf-84df-441d-0000000000e2 32935 1726853718.32152: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 32935 1726853718.32239: no more pending results, returning what we have 32935 1726853718.32242: results queue empty 32935 1726853718.32243: checking for any_errors_fatal 32935 1726853718.32249: done checking for any_errors_fatal 32935 1726853718.32250: checking for max_fail_percentage 32935 1726853718.32252: done checking for max_fail_percentage 32935 1726853718.32253: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.32254: done checking to see if all hosts have failed 32935 1726853718.32255: getting the remaining hosts for this loop 32935 1726853718.32257: done getting the remaining hosts for this loop 32935 1726853718.32261: getting the next task for host managed_node1 32935 1726853718.32270: done getting next task for host managed_node1 32935 1726853718.32276: ^ task is: TASK: Enable EPEL 8 32935 1726853718.32279: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853718.32283: getting variables 32935 1726853718.32285: in VariableManager get_vars() 32935 1726853718.32313: Calling all_inventory to load vars for managed_node1 32935 1726853718.32315: Calling groups_inventory to load vars for managed_node1 32935 1726853718.32319: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.32329: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.32331: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.32334: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.32534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.32724: done with get_vars() 32935 1726853718.32732: done getting variables 32935 1726853718.32790: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:35:18 -0400 (0:00:00.022) 0:00:03.463 ****** 32935 1726853718.32819: entering _queue_task() for managed_node1/command 32935 1726853718.33082: worker is 1 (out of 1 available) 32935 1726853718.33095: exiting _queue_task() for managed_node1/command 32935 1726853718.33109: done queuing things up, now waiting for results queue to drain 32935 1726853718.33111: waiting for pending results... 32935 1726853718.33389: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 32935 1726853718.33553: in run() - task 02083763-bbaf-84df-441d-0000000000e3 32935 1726853718.33558: variable 'ansible_search_path' from source: unknown 32935 1726853718.33564: variable 'ansible_search_path' from source: unknown 32935 1726853718.33568: calling self._execute() 32935 1726853718.33600: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.33609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.33615: variable 'omit' from source: magic vars 32935 1726853718.34031: variable 'ansible_distribution' from source: facts 32935 1726853718.34048: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32935 1726853718.34198: variable 'ansible_distribution_major_version' from source: facts 32935 1726853718.34209: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 32935 1726853718.34227: when evaluation is False, skipping this task 32935 1726853718.34240: _execute() done 32935 1726853718.34248: dumping result to json 32935 1726853718.34256: done dumping result, returning 32935 1726853718.34269: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [02083763-bbaf-84df-441d-0000000000e3] 32935 1726853718.34282: sending task result for task 02083763-bbaf-84df-441d-0000000000e3 32935 1726853718.34503: done sending task result for task 02083763-bbaf-84df-441d-0000000000e3 32935 1726853718.34506: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 32935 1726853718.34555: no more pending results, returning what we 
have 32935 1726853718.34561: results queue empty 32935 1726853718.34563: checking for any_errors_fatal 32935 1726853718.34567: done checking for any_errors_fatal 32935 1726853718.34568: checking for max_fail_percentage 32935 1726853718.34570: done checking for max_fail_percentage 32935 1726853718.34570: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.34574: done checking to see if all hosts have failed 32935 1726853718.34575: getting the remaining hosts for this loop 32935 1726853718.34576: done getting the remaining hosts for this loop 32935 1726853718.34580: getting the next task for host managed_node1 32935 1726853718.34591: done getting next task for host managed_node1 32935 1726853718.34594: ^ task is: TASK: Enable EPEL 6 32935 1726853718.34597: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853718.34608: getting variables 32935 1726853718.34611: in VariableManager get_vars() 32935 1726853718.34640: Calling all_inventory to load vars for managed_node1 32935 1726853718.34642: Calling groups_inventory to load vars for managed_node1 32935 1726853718.34646: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.34656: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.34660: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.34663: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.34844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.35066: done with get_vars() 32935 1726853718.35078: done getting variables 32935 1726853718.35133: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:35:18 -0400 (0:00:00.023) 0:00:03.487 ****** 32935 1726853718.35177: entering _queue_task() for managed_node1/copy 32935 1726853718.35446: worker is 1 (out of 1 available) 32935 1726853718.35462: exiting _queue_task() for managed_node1/copy 32935 1726853718.35685: done queuing things up, now waiting for results queue to drain 32935 1726853718.35687: waiting for pending results... 
32935 1726853718.35807: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 32935 1726853718.35864: in run() - task 02083763-bbaf-84df-441d-0000000000e5 32935 1726853718.35886: variable 'ansible_search_path' from source: unknown 32935 1726853718.35903: variable 'ansible_search_path' from source: unknown 32935 1726853718.35947: calling self._execute() 32935 1726853718.36034: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.36047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.36065: variable 'omit' from source: magic vars 32935 1726853718.36581: variable 'ansible_distribution' from source: facts 32935 1726853718.36669: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 32935 1726853718.36732: variable 'ansible_distribution_major_version' from source: facts 32935 1726853718.36744: Evaluated conditional (ansible_distribution_major_version == '6'): False 32935 1726853718.36751: when evaluation is False, skipping this task 32935 1726853718.36761: _execute() done 32935 1726853718.36780: dumping result to json 32935 1726853718.36791: done dumping result, returning 32935 1726853718.36801: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [02083763-bbaf-84df-441d-0000000000e5] 32935 1726853718.36809: sending task result for task 02083763-bbaf-84df-441d-0000000000e5 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 32935 1726853718.37069: no more pending results, returning what we have 32935 1726853718.37076: results queue empty 32935 1726853718.37078: checking for any_errors_fatal 32935 1726853718.37082: done checking for any_errors_fatal 32935 1726853718.37083: checking for max_fail_percentage 32935 1726853718.37085: done checking for max_fail_percentage 32935 1726853718.37086: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.37087: done checking to see if all hosts have failed 32935 1726853718.37092: getting the remaining hosts for this loop 32935 1726853718.37100: done getting the remaining hosts for this loop 32935 1726853718.37104: getting the next task for host managed_node1 32935 1726853718.37115: done getting next task for host managed_node1 32935 1726853718.37117: ^ task is: TASK: Set network provider to 'nm' 32935 1726853718.37120: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853718.37125: getting variables 32935 1726853718.37127: in VariableManager get_vars() 32935 1726853718.37156: Calling all_inventory to load vars for managed_node1 32935 1726853718.37162: Calling groups_inventory to load vars for managed_node1 32935 1726853718.37166: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.37181: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.37185: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.37189: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.37320: done sending task result for task 02083763-bbaf-84df-441d-0000000000e5 32935 1726853718.37324: WORKER PROCESS EXITING 32935 1726853718.37603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.37816: done with get_vars() 32935 1726853718.37824: done getting variables 32935 1726853718.37895: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:13 Friday 20 September 2024 13:35:18 -0400 (0:00:00.027) 0:00:03.514 ****** 32935 1726853718.37926: entering _queue_task() for managed_node1/set_fact 32935 1726853718.38253: worker is 1 (out of 1 available) 32935 1726853718.38269: exiting _queue_task() for managed_node1/set_fact 32935 1726853718.38355: done queuing things up, now waiting for results queue to drain 32935 1726853718.38360: waiting for pending results... 
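With the repo setup finished, the test pins the provider it is exercising: the result recorded below sets network_provider to nm unconditionally from tests_vlan_mtu_nm.yml:13. Reconstructed from that result, the task amounts to:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm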
32935 1726853718.38512: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 32935 1726853718.38613: in run() - task 02083763-bbaf-84df-441d-000000000007 32935 1726853718.38643: variable 'ansible_search_path' from source: unknown 32935 1726853718.38695: calling self._execute() 32935 1726853718.38778: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.38794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.38815: variable 'omit' from source: magic vars 32935 1726853718.38940: variable 'omit' from source: magic vars 32935 1726853718.38988: variable 'omit' from source: magic vars 32935 1726853718.39040: variable 'omit' from source: magic vars 32935 1726853718.39113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853718.39169: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853718.39186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853718.39206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853718.39236: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853718.39265: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853718.39330: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.39333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.39416: Set connection var ansible_timeout to 10 32935 1726853718.39429: Set connection var ansible_shell_type to sh 32935 1726853718.39453: Set connection var ansible_pipelining to False 32935 1726853718.39465: Set connection var ansible_connection to ssh 32935 1726853718.39479: Set connection var ansible_shell_executable to /bin/sh 32935 1726853718.39498: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853718.39547: variable 'ansible_shell_executable' from source: unknown 32935 1726853718.39551: variable 'ansible_connection' from source: unknown 32935 1726853718.39553: variable 'ansible_module_compression' from source: unknown 32935 1726853718.39560: variable 'ansible_shell_type' from source: unknown 32935 1726853718.39565: variable 'ansible_shell_executable' from source: unknown 32935 1726853718.39568: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.39570: variable 'ansible_pipelining' from source: unknown 32935 1726853718.39604: variable 'ansible_timeout' from source: unknown 32935 1726853718.39608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.39821: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853718.39826: variable 'omit' from source: magic vars 32935 1726853718.39829: starting attempt loop 32935 1726853718.39831: running the handler 32935 1726853718.39833: handler run complete 32935 1726853718.39837: attempt loop complete, returning result 32935 1726853718.39839: _execute() done 32935 1726853718.39841: 
dumping result to json 32935 1726853718.39850: done dumping result, returning 32935 1726853718.39878: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [02083763-bbaf-84df-441d-000000000007] 32935 1726853718.39931: sending task result for task 02083763-bbaf-84df-441d-000000000007 32935 1726853718.40062: done sending task result for task 02083763-bbaf-84df-441d-000000000007 32935 1726853718.40066: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 32935 1726853718.40126: no more pending results, returning what we have 32935 1726853718.40129: results queue empty 32935 1726853718.40130: checking for any_errors_fatal 32935 1726853718.40135: done checking for any_errors_fatal 32935 1726853718.40136: checking for max_fail_percentage 32935 1726853718.40138: done checking for max_fail_percentage 32935 1726853718.40139: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.40142: done checking to see if all hosts have failed 32935 1726853718.40142: getting the remaining hosts for this loop 32935 1726853718.40144: done getting the remaining hosts for this loop 32935 1726853718.40147: getting the next task for host managed_node1 32935 1726853718.40155: done getting next task for host managed_node1 32935 1726853718.40159: ^ task is: TASK: meta (flush_handlers) 32935 1726853718.40161: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853718.40167: getting variables 32935 1726853718.40169: in VariableManager get_vars() 32935 1726853718.40199: Calling all_inventory to load vars for managed_node1 32935 1726853718.40203: Calling groups_inventory to load vars for managed_node1 32935 1726853718.40206: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.40217: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.40220: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.40223: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.40575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.40818: done with get_vars() 32935 1726853718.40827: done getting variables 32935 1726853718.40893: in VariableManager get_vars() 32935 1726853718.40900: Calling all_inventory to load vars for managed_node1 32935 1726853718.40901: Calling groups_inventory to load vars for managed_node1 32935 1726853718.40902: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.40905: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.40906: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.40908: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.41005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.41188: done with get_vars() 32935 1726853718.41200: done queuing things up, now waiting for results queue to drain 32935 1726853718.41202: results queue empty 32935 1726853718.41203: checking for any_errors_fatal 32935 1726853718.41205: done checking for any_errors_fatal 32935 1726853718.41205: checking for 
max_fail_percentage 32935 1726853718.41206: done checking for max_fail_percentage 32935 1726853718.41207: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.41208: done checking to see if all hosts have failed 32935 1726853718.41209: getting the remaining hosts for this loop 32935 1726853718.41209: done getting the remaining hosts for this loop 32935 1726853718.41211: getting the next task for host managed_node1 32935 1726853718.41215: done getting next task for host managed_node1 32935 1726853718.41216: ^ task is: TASK: meta (flush_handlers) 32935 1726853718.41217: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853718.41224: getting variables 32935 1726853718.41225: in VariableManager get_vars() 32935 1726853718.41232: Calling all_inventory to load vars for managed_node1 32935 1726853718.41234: Calling groups_inventory to load vars for managed_node1 32935 1726853718.41236: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.41240: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.41243: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.41245: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.41379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.41762: done with get_vars() 32935 1726853718.41769: done getting variables 32935 1726853718.41814: in VariableManager get_vars() 32935 1726853718.41827: Calling all_inventory to load vars for managed_node1 32935 1726853718.41829: Calling groups_inventory to load vars for managed_node1 32935 1726853718.41832: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.41837: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.41839: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.41842: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.41969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.42154: done with get_vars() 32935 1726853718.42165: done queuing things up, now waiting for results queue to drain 32935 1726853718.42167: results queue empty 32935 1726853718.42168: checking for any_errors_fatal 32935 1726853718.42169: done checking for any_errors_fatal 32935 1726853718.42169: checking for max_fail_percentage 32935 1726853718.42172: done checking for max_fail_percentage 32935 1726853718.42173: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.42173: done checking to see if all hosts have failed 32935 1726853718.42174: getting the remaining hosts for this loop 32935 1726853718.42175: done getting the remaining hosts for this loop 32935 1726853718.42177: getting the next task for host managed_node1 32935 1726853718.42180: done getting next task for host managed_node1 32935 1726853718.42181: ^ task is: None 32935 1726853718.42182: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 32935 1726853718.42183: done queuing things up, now waiting for results queue to drain 32935 1726853718.42184: results queue empty 32935 1726853718.42185: checking for any_errors_fatal 32935 1726853718.42185: done checking for any_errors_fatal 32935 1726853718.42186: checking for max_fail_percentage 32935 1726853718.42187: done checking for max_fail_percentage 32935 1726853718.42187: checking to see if all hosts have failed and the running result is not ok 32935 1726853718.42188: done checking to see if all hosts have failed 32935 1726853718.42190: getting the next task for host managed_node1 32935 1726853718.42192: done getting next task for host managed_node1 32935 1726853718.42193: ^ task is: None 32935 1726853718.42194: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853718.42232: in VariableManager get_vars() 32935 1726853718.42256: done with get_vars() 32935 1726853718.42268: in VariableManager get_vars() 32935 1726853718.42286: done with get_vars() 32935 1726853718.42290: variable 'omit' from source: magic vars 32935 1726853718.42322: in VariableManager get_vars() 32935 1726853718.42337: done with get_vars() 32935 1726853718.42357: variable 'omit' from source: magic vars PLAY [Play for testing vlan mtu setting] *************************************** 32935 1726853718.42732: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 32935 1726853718.42757: getting the remaining hosts for this loop 32935 1726853718.42758: done getting the remaining hosts for this loop 32935 1726853718.42761: getting the next task for host managed_node1 32935 1726853718.42763: done getting next task for host managed_node1 32935 1726853718.42765: ^ task is: TASK: Gathering Facts 32935 1726853718.42767: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853718.42768: getting variables 32935 1726853718.42769: in VariableManager get_vars() 32935 1726853718.42783: Calling all_inventory to load vars for managed_node1 32935 1726853718.42785: Calling groups_inventory to load vars for managed_node1 32935 1726853718.42787: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853718.42793: Calling all_plugins_play to load vars for managed_node1 32935 1726853718.42810: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853718.42814: Calling groups_plugins_play to load vars for managed_node1 32935 1726853718.42953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853718.43194: done with get_vars() 32935 1726853718.43202: done getting variables 32935 1726853718.43245: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3 Friday 20 September 2024 13:35:18 -0400 (0:00:00.053) 0:00:03.568 ****** 32935 1726853718.43268: entering _queue_task() for managed_node1/gather_facts 32935 1726853718.43773: worker is 1 (out of 1 available) 32935 1726853718.43780: exiting _queue_task() for managed_node1/gather_facts 32935 1726853718.43789: done queuing things up, now waiting for results queue to drain 32935 1726853718.43791: waiting for pending results... 
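Two details in the trace above are worth calling out: the set_fact handler for "Set network provider to 'nm'" completed instantly on the controller (no SSH round-trip) and returned network_provider: nm with changed: false, and the strategy then moved on to the play "Play for testing vlan mtu setting" and queued the implicit "Gathering Facts" task from tests_vlan_mtu.yml:3. The playbook source itself is not reproduced in this log, so the following is only a hypothetical reconstruction; the hosts pattern, the single-file layout, and the trailing debug task are assumptions added for illustration.

# Hypothetical reconstruction -- the real tests_vlan_mtu.yml is not shown in this log.
# Grounded in the log: the task name "Set network provider to 'nm'", the resulting fact
# (network_provider: nm, changed: false), and the play name. Assumed: hosts pattern,
# single-file layout, and the debug task, which is purely illustrative.
- name: Select the provider for the test run
  hosts: all
  tasks:
    - name: Set network provider to 'nm'
      ansible.builtin.set_fact:
        network_provider: "nm"   # set_fact runs on the controller, hence the immediate "ok"

- name: Play for testing vlan mtu setting
  hosts: all
  # gather_facts defaults to true, so the linear strategy queues the implicit
  # "Gathering Facts" (ansible.builtin.setup) task seen next in the log.
  tasks:
    - name: Facts set earlier persist across plays for the same host
      ansible.builtin.debug:
        var: network_provider
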
32935 1726853718.43829: running TaskExecutor() for managed_node1/TASK: Gathering Facts 32935 1726853718.43976: in run() - task 02083763-bbaf-84df-441d-00000000010b 32935 1726853718.43981: variable 'ansible_search_path' from source: unknown 32935 1726853718.43988: calling self._execute() 32935 1726853718.44079: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.44091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.44106: variable 'omit' from source: magic vars 32935 1726853718.44565: variable 'ansible_distribution_major_version' from source: facts 32935 1726853718.44569: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853718.44573: variable 'omit' from source: magic vars 32935 1726853718.44576: variable 'omit' from source: magic vars 32935 1726853718.44585: variable 'omit' from source: magic vars 32935 1726853718.44628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853718.44678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853718.44705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853718.44727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853718.44744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853718.44785: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853718.44794: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.44802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.44906: Set connection var ansible_timeout to 10 32935 1726853718.44917: Set connection var ansible_shell_type to sh 32935 1726853718.44928: Set connection var ansible_pipelining to False 32935 1726853718.44975: Set connection var ansible_connection to ssh 32935 1726853718.44979: Set connection var ansible_shell_executable to /bin/sh 32935 1726853718.44981: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853718.44983: variable 'ansible_shell_executable' from source: unknown 32935 1726853718.44985: variable 'ansible_connection' from source: unknown 32935 1726853718.44993: variable 'ansible_module_compression' from source: unknown 32935 1726853718.45001: variable 'ansible_shell_type' from source: unknown 32935 1726853718.45008: variable 'ansible_shell_executable' from source: unknown 32935 1726853718.45015: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853718.45022: variable 'ansible_pipelining' from source: unknown 32935 1726853718.45028: variable 'ansible_timeout' from source: unknown 32935 1726853718.45037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853718.45325: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853718.45329: variable 'omit' from source: magic vars 32935 1726853718.45332: starting attempt loop 32935 1726853718.45335: running the 
handler 32935 1726853718.45337: variable 'ansible_facts' from source: unknown 32935 1726853718.45339: _low_level_execute_command(): starting 32935 1726853718.45341: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853718.46064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853718.46095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853718.46198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853718.46216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853718.46234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853718.46258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853718.46340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853718.48692: stdout chunk (state=3): >>>/root <<< 32935 1726853718.48887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853718.48902: stdout chunk (state=3): >>><<< 32935 1726853718.48921: stderr chunk (state=3): >>><<< 32935 1726853718.48948: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853718.48973: _low_level_execute_command(): starting 32935 1726853718.49067: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963 `" && echo ansible-tmp-1726853718.489555-33141-168740778191963="` echo /root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963 `" ) && sleep 0' 32935 1726853718.49668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853718.49686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853718.49702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853718.49733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853718.49841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853718.49899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853718.49941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853718.52753: stdout chunk (state=3): >>>ansible-tmp-1726853718.489555-33141-168740778191963=/root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963 <<< 32935 1726853718.52978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853718.52981: stdout chunk (state=3): >>><<< 32935 1726853718.52984: stderr chunk (state=3): >>><<< 32935 1726853718.53001: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853718.489555-33141-168740778191963=/root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853718.53184: variable 
'ansible_module_compression' from source: unknown 32935 1726853718.53188: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 32935 1726853718.53190: variable 'ansible_facts' from source: unknown 32935 1726853718.53468: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/AnsiballZ_setup.py 32935 1726853718.53643: Sending initial data 32935 1726853718.53653: Sent initial data (153 bytes) 32935 1726853718.54386: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853718.54437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853718.54454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853718.54477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853718.54552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853718.56810: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853718.56848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853718.56888: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp61tm_jrk /root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/AnsiballZ_setup.py <<< 32935 1726853718.56897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/AnsiballZ_setup.py" <<< 32935 1726853718.56955: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp61tm_jrk" to remote "/root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/AnsiballZ_setup.py" <<< 32935 1726853718.58716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853718.58728: stdout chunk (state=3): >>><<< 32935 1726853718.58744: stderr chunk (state=3): >>><<< 32935 1726853718.58774: done transferring module to remote 32935 1726853718.58792: _low_level_execute_command(): starting 32935 1726853718.58803: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/ /root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/AnsiballZ_setup.py && sleep 0' 32935 1726853718.60134: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853718.60148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853718.60187: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853718.60296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853718.60343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853718.62817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853718.62920: stderr chunk (state=3): >>><<< 32935 1726853718.62924: stdout chunk (state=3): >>><<< 32935 1726853718.63018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853718.63031: _low_level_execute_command(): starting 32935 1726853718.63034: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/AnsiballZ_setup.py && sleep 0' 32935 1726853718.63986: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853718.63989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853718.64242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853718.64278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853718.64288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853718.64780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853719.52528: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, 
"ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "19", "epoch": "1726853719", "epoch_int": "1726853719", "date": "2024-09-20", "time": "13:35:19", "iso8601_micro": "2024-09-20T17:35:19.042602Z", "iso8601": "2024-09-20T17:35:19Z", "iso8601_basic": "20240920T133519042602", "iso8601_basic_short": "20240920T133519", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, 
"releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": 
{"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_loadavg": {"1m": 0.669921875, "5m": 0.453125, "15m": 0.255859375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2938, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 593, "free": 2938}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, 
"free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 885, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261771161600, "block_size": 4096, "block_total": 65519099, "block_available": 63908975, "block_used": 1610124, "inode_total": 131070960, "inode_available": 131028924, "inode_used": 42036, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 32935 1726853719.54751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853719.54755: stdout chunk (state=3): >>><<< 32935 1726853719.54757: stderr chunk (state=3): >>><<< 32935 1726853719.54760: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "35", "second": "19", "epoch": "1726853719", "epoch_int": "1726853719", "date": "2024-09-20", "time": "13:35:19", "iso8601_micro": "2024-09-20T17:35:19.042602Z", "iso8601": "2024-09-20T17:35:19Z", "iso8601_basic": "20240920T133519042602", "iso8601_basic_short": "20240920T133519", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_loadavg": {"1m": 0.669921875, "5m": 0.453125, "15m": 0.255859375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2938, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 593, "free": 2938}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 885, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", 
"device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261771161600, "block_size": 4096, "block_total": 65519099, "block_available": 63908975, "block_used": 1610124, "inode_total": 131070960, "inode_available": 131028924, "inode_used": 42036, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
32935 1726853719.55546: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853719.55550: _low_level_execute_command(): starting 32935 1726853719.55552: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853718.489555-33141-168740778191963/ > /dev/null 2>&1 && sleep 0' 32935 1726853719.56896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853719.56913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853719.56928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853719.57104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853719.57247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853719.57344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853719.59274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853719.59279: stdout chunk (state=3): >>><<< 32935 1726853719.59281: stderr chunk (state=3): >>><<< 32935 1726853719.59481: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853719.59484: handler run complete 32935 1726853719.59536: variable 'ansible_facts' from source: unknown 32935 1726853719.59795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853719.61023: variable 'ansible_facts' from source: unknown 32935 1726853719.61279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853719.61433: attempt loop complete, returning result 32935 1726853719.61447: _execute() done 32935 1726853719.61455: dumping result to json 32935 1726853719.61499: done dumping result, returning 32935 1726853719.61512: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-84df-441d-00000000010b] 32935 1726853719.61521: sending task result for task 02083763-bbaf-84df-441d-00000000010b ok: [managed_node1] 32935 1726853719.62641: no more pending results, returning what we have 32935 1726853719.62643: results queue empty 32935 1726853719.62644: checking for any_errors_fatal 32935 1726853719.62645: done checking for any_errors_fatal 32935 1726853719.62646: checking for max_fail_percentage 32935 1726853719.62647: done checking for max_fail_percentage 32935 1726853719.62648: checking to see if all hosts have failed and the running result is not ok 32935 1726853719.62648: done checking to see if all hosts have failed 32935 1726853719.62649: getting the remaining hosts for this loop 32935 1726853719.62650: done getting the remaining hosts for this loop 32935 1726853719.62653: getting the next task for host managed_node1 32935 1726853719.62660: done getting next task for host managed_node1 32935 1726853719.62662: ^ task is: TASK: meta (flush_handlers) 32935 1726853719.62663: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853719.62667: getting variables 32935 1726853719.62668: in VariableManager get_vars() 32935 1726853719.62697: Calling all_inventory to load vars for managed_node1 32935 1726853719.62700: Calling groups_inventory to load vars for managed_node1 32935 1726853719.62702: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853719.62711: Calling all_plugins_play to load vars for managed_node1 32935 1726853719.62714: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853719.62716: Calling groups_plugins_play to load vars for managed_node1 32935 1726853719.62868: done sending task result for task 02083763-bbaf-84df-441d-00000000010b 32935 1726853719.62876: WORKER PROCESS EXITING 32935 1726853719.62899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853719.63119: done with get_vars() 32935 1726853719.63131: done getting variables 32935 1726853719.63203: in VariableManager get_vars() 32935 1726853719.63217: Calling all_inventory to load vars for managed_node1 32935 1726853719.63219: Calling groups_inventory to load vars for managed_node1 32935 1726853719.63221: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853719.63226: Calling all_plugins_play to load vars for managed_node1 32935 1726853719.63228: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853719.63230: Calling groups_plugins_play to load vars for managed_node1 32935 1726853719.63391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853719.63587: done with get_vars() 32935 1726853719.63600: done queuing things up, now waiting for results queue to drain 32935 1726853719.63602: results queue empty 32935 1726853719.63602: checking for any_errors_fatal 32935 1726853719.63605: done checking for any_errors_fatal 32935 1726853719.63606: checking for max_fail_percentage 32935 1726853719.63607: done checking for max_fail_percentage 32935 1726853719.63608: checking to see if all hosts have failed and the running result is not ok 32935 1726853719.63613: done checking to see if all hosts have failed 32935 1726853719.63613: getting the remaining hosts for this loop 32935 1726853719.63614: done getting the remaining hosts for this loop 32935 1726853719.63617: getting the next task for host managed_node1 32935 1726853719.63620: done getting next task for host managed_node1 32935 1726853719.63623: ^ task is: TASK: Include the task 'show_interfaces.yml' 32935 1726853719.63624: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853719.63626: getting variables 32935 1726853719.63627: in VariableManager get_vars() 32935 1726853719.63638: Calling all_inventory to load vars for managed_node1 32935 1726853719.63640: Calling groups_inventory to load vars for managed_node1 32935 1726853719.63642: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853719.63646: Calling all_plugins_play to load vars for managed_node1 32935 1726853719.63649: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853719.63651: Calling groups_plugins_play to load vars for managed_node1 32935 1726853719.63792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853719.63996: done with get_vars() 32935 1726853719.64004: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:10 Friday 20 September 2024 13:35:19 -0400 (0:00:01.208) 0:00:04.776 ****** 32935 1726853719.64081: entering _queue_task() for managed_node1/include_tasks 32935 1726853719.64480: worker is 1 (out of 1 available) 32935 1726853719.64492: exiting _queue_task() for managed_node1/include_tasks 32935 1726853719.64501: done queuing things up, now waiting for results queue to drain 32935 1726853719.64503: waiting for pending results... 32935 1726853719.64675: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 32935 1726853719.64770: in run() - task 02083763-bbaf-84df-441d-00000000000b 32935 1726853719.64795: variable 'ansible_search_path' from source: unknown 32935 1726853719.64834: calling self._execute() 32935 1726853719.64939: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853719.65009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853719.65013: variable 'omit' from source: magic vars 32935 1726853719.65363: variable 'ansible_distribution_major_version' from source: facts 32935 1726853719.65383: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853719.65394: _execute() done 32935 1726853719.65402: dumping result to json 32935 1726853719.65409: done dumping result, returning 32935 1726853719.65418: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-84df-441d-00000000000b] 32935 1726853719.65426: sending task result for task 02083763-bbaf-84df-441d-00000000000b 32935 1726853719.65687: no more pending results, returning what we have 32935 1726853719.65692: in VariableManager get_vars() 32935 1726853719.65737: Calling all_inventory to load vars for managed_node1 32935 1726853719.65739: Calling groups_inventory to load vars for managed_node1 32935 1726853719.65742: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853719.65756: Calling all_plugins_play to load vars for managed_node1 32935 1726853719.65759: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853719.65762: Calling groups_plugins_play to load vars for managed_node1 32935 1726853719.66063: done sending task result for task 02083763-bbaf-84df-441d-00000000000b 32935 1726853719.66067: WORKER PROCESS EXITING 32935 1726853719.66096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853719.66333: done with get_vars() 32935 1726853719.66339: 
variable 'ansible_search_path' from source: unknown 32935 1726853719.66352: we have included files to process 32935 1726853719.66353: generating all_blocks data 32935 1726853719.66354: done generating all_blocks data 32935 1726853719.66355: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32935 1726853719.66356: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32935 1726853719.66358: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32935 1726853719.66511: in VariableManager get_vars() 32935 1726853719.66536: done with get_vars() 32935 1726853719.66643: done processing included file 32935 1726853719.66646: iterating over new_blocks loaded from include file 32935 1726853719.66647: in VariableManager get_vars() 32935 1726853719.66663: done with get_vars() 32935 1726853719.66664: filtering new block on tags 32935 1726853719.66682: done filtering new block on tags 32935 1726853719.66684: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 32935 1726853719.66689: extending task lists for all hosts with included blocks 32935 1726853719.68968: done extending task lists 32935 1726853719.68970: done processing included files 32935 1726853719.68972: results queue empty 32935 1726853719.68973: checking for any_errors_fatal 32935 1726853719.68975: done checking for any_errors_fatal 32935 1726853719.68976: checking for max_fail_percentage 32935 1726853719.68977: done checking for max_fail_percentage 32935 1726853719.68978: checking to see if all hosts have failed and the running result is not ok 32935 1726853719.68979: done checking to see if all hosts have failed 32935 1726853719.68979: getting the remaining hosts for this loop 32935 1726853719.68981: done getting the remaining hosts for this loop 32935 1726853719.68983: getting the next task for host managed_node1 32935 1726853719.68988: done getting next task for host managed_node1 32935 1726853719.68990: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 32935 1726853719.68993: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853719.68995: getting variables 32935 1726853719.68997: in VariableManager get_vars() 32935 1726853719.69020: Calling all_inventory to load vars for managed_node1 32935 1726853719.69023: Calling groups_inventory to load vars for managed_node1 32935 1726853719.69026: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853719.69033: Calling all_plugins_play to load vars for managed_node1 32935 1726853719.69036: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853719.69039: Calling groups_plugins_play to load vars for managed_node1 32935 1726853719.69198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853719.69396: done with get_vars() 32935 1726853719.69408: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:35:19 -0400 (0:00:00.054) 0:00:04.830 ****** 32935 1726853719.69495: entering _queue_task() for managed_node1/include_tasks 32935 1726853719.69834: worker is 1 (out of 1 available) 32935 1726853719.69849: exiting _queue_task() for managed_node1/include_tasks 32935 1726853719.69866: done queuing things up, now waiting for results queue to drain 32935 1726853719.69868: waiting for pending results... 32935 1726853719.70140: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 32935 1726853719.70269: in run() - task 02083763-bbaf-84df-441d-000000000120 32935 1726853719.70293: variable 'ansible_search_path' from source: unknown 32935 1726853719.70308: variable 'ansible_search_path' from source: unknown 32935 1726853719.70349: calling self._execute() 32935 1726853719.70445: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853719.70460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853719.70479: variable 'omit' from source: magic vars 32935 1726853719.70936: variable 'ansible_distribution_major_version' from source: facts 32935 1726853719.70962: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853719.70979: _execute() done 32935 1726853719.70988: dumping result to json 32935 1726853719.70996: done dumping result, returning 32935 1726853719.71007: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-84df-441d-000000000120] 32935 1726853719.71018: sending task result for task 02083763-bbaf-84df-441d-000000000120 32935 1726853719.71207: no more pending results, returning what we have 32935 1726853719.71213: in VariableManager get_vars() 32935 1726853719.71264: Calling all_inventory to load vars for managed_node1 32935 1726853719.71268: Calling groups_inventory to load vars for managed_node1 32935 1726853719.71272: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853719.71290: Calling all_plugins_play to load vars for managed_node1 32935 1726853719.71293: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853719.71296: Calling groups_plugins_play to load vars for managed_node1 32935 1726853719.71689: done sending task result for task 02083763-bbaf-84df-441d-000000000120 32935 1726853719.71692: WORKER PROCESS EXITING 32935 1726853719.71713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 32935 1726853719.71956: done with get_vars() 32935 1726853719.71966: variable 'ansible_search_path' from source: unknown 32935 1726853719.71967: variable 'ansible_search_path' from source: unknown 32935 1726853719.72008: we have included files to process 32935 1726853719.72009: generating all_blocks data 32935 1726853719.72010: done generating all_blocks data 32935 1726853719.72011: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32935 1726853719.72012: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32935 1726853719.72015: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32935 1726853719.72363: done processing included file 32935 1726853719.72365: iterating over new_blocks loaded from include file 32935 1726853719.72367: in VariableManager get_vars() 32935 1726853719.72390: done with get_vars() 32935 1726853719.72392: filtering new block on tags 32935 1726853719.72409: done filtering new block on tags 32935 1726853719.72411: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 32935 1726853719.72416: extending task lists for all hosts with included blocks 32935 1726853719.72528: done extending task lists 32935 1726853719.72535: done processing included files 32935 1726853719.72537: results queue empty 32935 1726853719.72537: checking for any_errors_fatal 32935 1726853719.72540: done checking for any_errors_fatal 32935 1726853719.72541: checking for max_fail_percentage 32935 1726853719.72543: done checking for max_fail_percentage 32935 1726853719.72544: checking to see if all hosts have failed and the running result is not ok 32935 1726853719.72544: done checking to see if all hosts have failed 32935 1726853719.72545: getting the remaining hosts for this loop 32935 1726853719.72546: done getting the remaining hosts for this loop 32935 1726853719.72549: getting the next task for host managed_node1 32935 1726853719.72553: done getting next task for host managed_node1 32935 1726853719.72555: ^ task is: TASK: Gather current interface info 32935 1726853719.72562: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853719.72564: getting variables 32935 1726853719.72565: in VariableManager get_vars() 32935 1726853719.72579: Calling all_inventory to load vars for managed_node1 32935 1726853719.72581: Calling groups_inventory to load vars for managed_node1 32935 1726853719.72583: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853719.72602: Calling all_plugins_play to load vars for managed_node1 32935 1726853719.72606: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853719.72610: Calling groups_plugins_play to load vars for managed_node1 32935 1726853719.72811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853719.73191: done with get_vars() 32935 1726853719.73199: done getting variables 32935 1726853719.73304: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:35:19 -0400 (0:00:00.038) 0:00:04.869 ****** 32935 1726853719.73340: entering _queue_task() for managed_node1/command 32935 1726853719.73903: worker is 1 (out of 1 available) 32935 1726853719.73920: exiting _queue_task() for managed_node1/command 32935 1726853719.73929: done queuing things up, now waiting for results queue to drain 32935 1726853719.73931: waiting for pending results... 
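The trace above records the include chain being built: tests_vlan_mtu.yml:10 includes show_interfaces.yml (with the conditional ansible_distribution_major_version != '6' evaluated True), and show_interfaces.yml:3 in turn includes get_current_interfaces.yml. A hedged sketch of that chain, assuming the conditional is attached directly to the first include; only the task names, file names, and the conditional are taken from the trace, the surrounding layout of the real test files is assumed:

    # Hedged sketch of the include chain recorded in the trace.
    # tests_vlan_mtu.yml (line 10 in the log)
    - name: Include the task 'show_interfaces.yml'
      ansible.builtin.include_tasks: tasks/show_interfaces.yml
      when: ansible_distribution_major_version != '6'

    # tasks/show_interfaces.yml (line 3 in the log)
    - name: Include the task 'get_current_interfaces.yml'
      ansible.builtin.include_tasks: get_current_interfaces.yml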
32935 1726853719.74187: running TaskExecutor() for managed_node1/TASK: Gather current interface info 32935 1726853719.74338: in run() - task 02083763-bbaf-84df-441d-0000000001ff 32935 1726853719.74394: variable 'ansible_search_path' from source: unknown 32935 1726853719.74416: variable 'ansible_search_path' from source: unknown 32935 1726853719.74482: calling self._execute() 32935 1726853719.74695: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853719.74740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853719.74778: variable 'omit' from source: magic vars 32935 1726853719.75527: variable 'ansible_distribution_major_version' from source: facts 32935 1726853719.75569: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853719.75679: variable 'omit' from source: magic vars 32935 1726853719.75683: variable 'omit' from source: magic vars 32935 1726853719.75686: variable 'omit' from source: magic vars 32935 1726853719.75740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853719.75806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853719.75835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853719.75860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853719.75880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853719.75970: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853719.75986: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853719.76175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853719.76179: Set connection var ansible_timeout to 10 32935 1726853719.76187: Set connection var ansible_shell_type to sh 32935 1726853719.76190: Set connection var ansible_pipelining to False 32935 1726853719.76192: Set connection var ansible_connection to ssh 32935 1726853719.76195: Set connection var ansible_shell_executable to /bin/sh 32935 1726853719.76197: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853719.76250: variable 'ansible_shell_executable' from source: unknown 32935 1726853719.76261: variable 'ansible_connection' from source: unknown 32935 1726853719.76290: variable 'ansible_module_compression' from source: unknown 32935 1726853719.76310: variable 'ansible_shell_type' from source: unknown 32935 1726853719.76334: variable 'ansible_shell_executable' from source: unknown 32935 1726853719.76373: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853719.76376: variable 'ansible_pipelining' from source: unknown 32935 1726853719.76378: variable 'ansible_timeout' from source: unknown 32935 1726853719.76390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853719.76620: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853719.76715: variable 'omit' from source: magic vars 32935 
1726853719.76718: starting attempt loop 32935 1726853719.76720: running the handler 32935 1726853719.76723: _low_level_execute_command(): starting 32935 1726853719.76725: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853719.77767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853719.77822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853719.78033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853719.78082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853719.79775: stdout chunk (state=3): >>>/root <<< 32935 1726853719.79917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853719.79928: stdout chunk (state=3): >>><<< 32935 1726853719.79951: stderr chunk (state=3): >>><<< 32935 1726853719.79980: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853719.80088: _low_level_execute_command(): starting 32935 1726853719.80092: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088 `" && echo ansible-tmp-1726853719.7998798-33212-126024980844088="` echo /root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088 `" ) && 
sleep 0' 32935 1726853719.80887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853719.80917: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853719.80983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853719.81008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853719.81125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853719.81155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853719.81192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853719.81282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853719.83209: stdout chunk (state=3): >>>ansible-tmp-1726853719.7998798-33212-126024980844088=/root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088 <<< 32935 1726853719.83577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853719.83582: stdout chunk (state=3): >>><<< 32935 1726853719.83584: stderr chunk (state=3): >>><<< 32935 1726853719.83586: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853719.7998798-33212-126024980844088=/root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853719.83588: variable 'ansible_module_compression' from source: unknown 32935 1726853719.83632: ANSIBALLZ: Using generic lock for ansible.legacy.command 32935 1726853719.83644: ANSIBALLZ: Acquiring 
lock 32935 1726853719.83650: ANSIBALLZ: Lock acquired: 140683294872048 32935 1726853719.83656: ANSIBALLZ: Creating module 32935 1726853719.96928: ANSIBALLZ: Writing module into payload 32935 1726853719.97023: ANSIBALLZ: Writing module 32935 1726853719.97045: ANSIBALLZ: Renaming module 32935 1726853719.97050: ANSIBALLZ: Done creating module 32935 1726853719.97073: variable 'ansible_facts' from source: unknown 32935 1726853719.97147: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/AnsiballZ_command.py 32935 1726853719.97340: Sending initial data 32935 1726853719.97343: Sent initial data (156 bytes) 32935 1726853719.97931: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853719.97945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853719.97959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853719.98079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853719.98104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853719.98186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853719.99855: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853719.99928: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853719.99984: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp51w5ht38 /root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/AnsiballZ_command.py <<< 32935 1726853719.99987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/AnsiballZ_command.py" <<< 32935 1726853720.00037: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp51w5ht38" to remote "/root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/AnsiballZ_command.py" <<< 32935 1726853720.00828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853720.00906: stderr chunk (state=3): >>><<< 32935 1726853720.00915: stdout chunk (state=3): >>><<< 32935 1726853720.01045: done transferring module to remote 32935 1726853720.01048: _low_level_execute_command(): starting 32935 1726853720.01050: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/ /root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/AnsiballZ_command.py && sleep 0' 32935 1726853720.01850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853720.01976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853720.01994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853720.02052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853720.02164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853720.03960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853720.04016: stderr chunk (state=3): >>><<< 32935 1726853720.04264: stdout chunk (state=3): >>><<< 32935 1726853720.04268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853720.04272: _low_level_execute_command(): starting 32935 1726853720.04276: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/AnsiballZ_command.py && sleep 0' 32935 1726853720.04798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853720.04810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853720.04826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853720.04846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853720.04867: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853720.04969: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853720.04983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853720.05000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853720.05081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853720.20653: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:20.201283", "end": "2024-09-20 13:35:20.204603", "delta": "0:00:00.003320", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853720.22141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853720.22145: stdout chunk (state=3): >>><<< 32935 1726853720.22166: stderr chunk (state=3): >>><<< 32935 1726853720.22195: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:20.201283", "end": "2024-09-20 13:35:20.204603", "delta": "0:00:00.003320", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
32935 1726853720.22246: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853720.22360: _low_level_execute_command(): starting 32935 1726853720.22364: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853719.7998798-33212-126024980844088/ > /dev/null 2>&1 && sleep 0' 32935 1726853720.23468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853720.23475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853720.23522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853720.23581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853720.23585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853720.23587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853720.24176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853720.25722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853720.25769: stderr chunk (state=3): >>><<< 32935 1726853720.25786: stdout chunk (state=3): >>><<< 32935 1726853720.25808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853720.25815: handler run complete 32935 1726853720.25841: Evaluated conditional (False): False 32935 1726853720.25851: attempt loop complete, returning result 32935 1726853720.25854: _execute() done 32935 1726853720.25857: dumping result to json 32935 1726853720.25861: done dumping result, returning 32935 1726853720.25870: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [02083763-bbaf-84df-441d-0000000001ff] 32935 1726853720.25878: sending task result for task 02083763-bbaf-84df-441d-0000000001ff 32935 1726853720.26076: done sending task result for task 02083763-bbaf-84df-441d-0000000001ff 32935 1726853720.26079: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003320", "end": "2024-09-20 13:35:20.204603", "rc": 0, "start": "2024-09-20 13:35:20.201283" } STDOUT: bonding_masters eth0 lo 32935 1726853720.26149: no more pending results, returning what we have 32935 1726853720.26152: results queue empty 32935 1726853720.26153: checking for any_errors_fatal 32935 1726853720.26154: done checking for any_errors_fatal 32935 1726853720.26154: checking for max_fail_percentage 32935 1726853720.26156: done checking for max_fail_percentage 32935 1726853720.26157: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.26160: done checking to see if all hosts have failed 32935 1726853720.26161: getting the remaining hosts for this loop 32935 1726853720.26163: done getting the remaining hosts for this loop 32935 1726853720.26166: getting the next task for host managed_node1 32935 1726853720.26174: done getting next task for host managed_node1 32935 1726853720.26176: ^ task is: TASK: Set current_interfaces 32935 1726853720.26180: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853720.26183: getting variables 32935 1726853720.26184: in VariableManager get_vars() 32935 1726853720.26218: Calling all_inventory to load vars for managed_node1 32935 1726853720.26220: Calling groups_inventory to load vars for managed_node1 32935 1726853720.26222: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.26231: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.26233: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.26236: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.26425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.26605: done with get_vars() 32935 1726853720.26614: done getting variables 32935 1726853720.26672: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:35:20 -0400 (0:00:00.533) 0:00:05.402 ****** 32935 1726853720.26698: entering _queue_task() for managed_node1/set_fact 32935 1726853720.27332: worker is 1 (out of 1 available) 32935 1726853720.27344: exiting _queue_task() for managed_node1/set_fact 32935 1726853720.27355: done queuing things up, now waiting for results queue to drain 32935 1726853720.27357: waiting for pending results... 
32935 1726853720.27741: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 32935 1726853720.28079: in run() - task 02083763-bbaf-84df-441d-000000000200 32935 1726853720.28187: variable 'ansible_search_path' from source: unknown 32935 1726853720.28191: variable 'ansible_search_path' from source: unknown 32935 1726853720.28195: calling self._execute() 32935 1726853720.28327: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.28338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.28352: variable 'omit' from source: magic vars 32935 1726853720.28802: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.28965: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.28978: variable 'omit' from source: magic vars 32935 1726853720.29025: variable 'omit' from source: magic vars 32935 1726853720.29347: variable '_current_interfaces' from source: set_fact 32935 1726853720.29703: variable 'omit' from source: magic vars 32935 1726853720.29746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853720.30178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853720.30182: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853720.30185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.30187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.30189: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853720.30192: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.30194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.30197: Set connection var ansible_timeout to 10 32935 1726853720.30199: Set connection var ansible_shell_type to sh 32935 1726853720.30201: Set connection var ansible_pipelining to False 32935 1726853720.30204: Set connection var ansible_connection to ssh 32935 1726853720.30575: Set connection var ansible_shell_executable to /bin/sh 32935 1726853720.30579: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853720.30581: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.30583: variable 'ansible_connection' from source: unknown 32935 1726853720.30585: variable 'ansible_module_compression' from source: unknown 32935 1726853720.30587: variable 'ansible_shell_type' from source: unknown 32935 1726853720.30589: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.30591: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.30592: variable 'ansible_pipelining' from source: unknown 32935 1726853720.30595: variable 'ansible_timeout' from source: unknown 32935 1726853720.30596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.30600: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 32935 1726853720.30603: variable 'omit' from source: magic vars 32935 1726853720.30782: starting attempt loop 32935 1726853720.30791: running the handler 32935 1726853720.30806: handler run complete 32935 1726853720.30819: attempt loop complete, returning result 32935 1726853720.30827: _execute() done 32935 1726853720.30830: dumping result to json 32935 1726853720.30833: done dumping result, returning 32935 1726853720.30840: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [02083763-bbaf-84df-441d-000000000200] 32935 1726853720.30848: sending task result for task 02083763-bbaf-84df-441d-000000000200 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 32935 1726853720.31002: no more pending results, returning what we have 32935 1726853720.31004: results queue empty 32935 1726853720.31005: checking for any_errors_fatal 32935 1726853720.31011: done checking for any_errors_fatal 32935 1726853720.31012: checking for max_fail_percentage 32935 1726853720.31014: done checking for max_fail_percentage 32935 1726853720.31014: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.31016: done checking to see if all hosts have failed 32935 1726853720.31016: getting the remaining hosts for this loop 32935 1726853720.31017: done getting the remaining hosts for this loop 32935 1726853720.31021: getting the next task for host managed_node1 32935 1726853720.31029: done getting next task for host managed_node1 32935 1726853720.31031: ^ task is: TASK: Show current_interfaces 32935 1726853720.31033: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853720.31038: getting variables 32935 1726853720.31040: in VariableManager get_vars() 32935 1726853720.31081: Calling all_inventory to load vars for managed_node1 32935 1726853720.31084: Calling groups_inventory to load vars for managed_node1 32935 1726853720.31087: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.31101: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.31104: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.31108: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.31687: done sending task result for task 02083763-bbaf-84df-441d-000000000200 32935 1726853720.31691: WORKER PROCESS EXITING 32935 1726853720.31714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.32265: done with get_vars() 32935 1726853720.32279: done getting variables 32935 1726853720.32374: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:35:20 -0400 (0:00:00.057) 0:00:05.459 ****** 32935 1726853720.32403: entering _queue_task() for managed_node1/debug 32935 1726853720.32404: Creating lock for debug 32935 1726853720.32887: worker is 1 (out of 1 available) 32935 1726853720.32899: exiting _queue_task() for managed_node1/debug 32935 1726853720.32910: done queuing things up, now waiting for results queue to drain 32935 1726853720.32912: waiting for pending results... 
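The 'Set current_interfaces' task above reported ansible_facts.current_interfaces = ["bonding_masters", "eth0", "lo"], built from the _current_interfaces variable. The task file itself is not reproduced in the log; a plausible sketch, assuming _current_interfaces is a registered command result with a stdout_lines attribute (consistent with the command invocation recorded near the end of this trace), is:

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed shape of the registered variable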
32935 1726853720.33155: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 32935 1726853720.33472: in run() - task 02083763-bbaf-84df-441d-000000000121 32935 1726853720.33492: variable 'ansible_search_path' from source: unknown 32935 1726853720.33499: variable 'ansible_search_path' from source: unknown 32935 1726853720.33534: calling self._execute() 32935 1726853720.33809: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.33821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.33878: variable 'omit' from source: magic vars 32935 1726853720.34400: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.34542: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.34553: variable 'omit' from source: magic vars 32935 1726853720.34596: variable 'omit' from source: magic vars 32935 1726853720.34734: variable 'current_interfaces' from source: set_fact 32935 1726853720.34888: variable 'omit' from source: magic vars 32935 1726853720.34931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853720.35111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853720.35135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853720.35157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.35180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.35212: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853720.35290: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.35297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.35386: Set connection var ansible_timeout to 10 32935 1726853720.35511: Set connection var ansible_shell_type to sh 32935 1726853720.35523: Set connection var ansible_pipelining to False 32935 1726853720.35529: Set connection var ansible_connection to ssh 32935 1726853720.35539: Set connection var ansible_shell_executable to /bin/sh 32935 1726853720.35549: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853720.35583: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.35615: variable 'ansible_connection' from source: unknown 32935 1726853720.35623: variable 'ansible_module_compression' from source: unknown 32935 1726853720.35629: variable 'ansible_shell_type' from source: unknown 32935 1726853720.35636: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.35828: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.35831: variable 'ansible_pipelining' from source: unknown 32935 1726853720.35833: variable 'ansible_timeout' from source: unknown 32935 1726853720.35836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.36002: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
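The action loaded here for 'Show current_interfaces' (show_interfaces.yml:5) is the debug plugin, and the message it prints below is "current_interfaces: ['bonding_masters', 'eth0', 'lo']". A task definition consistent with that output, offered as a sketch rather than the verified file contents, would be:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"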
32935 1726853720.36057: variable 'omit' from source: magic vars 32935 1726853720.36073: starting attempt loop 32935 1726853720.36081: running the handler 32935 1726853720.36266: handler run complete 32935 1726853720.36269: attempt loop complete, returning result 32935 1726853720.36273: _execute() done 32935 1726853720.36275: dumping result to json 32935 1726853720.36281: done dumping result, returning 32935 1726853720.36293: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [02083763-bbaf-84df-441d-000000000121] 32935 1726853720.36303: sending task result for task 02083763-bbaf-84df-441d-000000000121 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 32935 1726853720.36450: no more pending results, returning what we have 32935 1726853720.36453: results queue empty 32935 1726853720.36454: checking for any_errors_fatal 32935 1726853720.36462: done checking for any_errors_fatal 32935 1726853720.36463: checking for max_fail_percentage 32935 1726853720.36465: done checking for max_fail_percentage 32935 1726853720.36466: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.36467: done checking to see if all hosts have failed 32935 1726853720.36468: getting the remaining hosts for this loop 32935 1726853720.36469: done getting the remaining hosts for this loop 32935 1726853720.36475: getting the next task for host managed_node1 32935 1726853720.36485: done getting next task for host managed_node1 32935 1726853720.36488: ^ task is: TASK: Include the task 'manage_test_interface.yml' 32935 1726853720.36490: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853720.36494: getting variables 32935 1726853720.36496: in VariableManager get_vars() 32935 1726853720.36539: Calling all_inventory to load vars for managed_node1 32935 1726853720.36542: Calling groups_inventory to load vars for managed_node1 32935 1726853720.36545: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.36557: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.36563: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.36567: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.37053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.37631: done sending task result for task 02083763-bbaf-84df-441d-000000000121 32935 1726853720.37635: WORKER PROCESS EXITING 32935 1726853720.37648: done with get_vars() 32935 1726853720.37661: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:12 Friday 20 September 2024 13:35:20 -0400 (0:00:00.053) 0:00:05.513 ****** 32935 1726853720.37751: entering _queue_task() for managed_node1/include_tasks 32935 1726853720.38012: worker is 1 (out of 1 available) 32935 1726853720.38026: exiting _queue_task() for managed_node1/include_tasks 32935 1726853720.38038: done queuing things up, now waiting for results queue to drain 32935 1726853720.38040: waiting for pending results... 
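The task queued next pulls in manage_test_interface.yml from tests_vlan_mtu.yml:12 via the include_tasks action. A sketch of such a play-level task (the relative path is assumed from the file locations shown in the trace):

    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml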
32935 1726853720.38270: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 32935 1726853720.38352: in run() - task 02083763-bbaf-84df-441d-00000000000c 32935 1726853720.38376: variable 'ansible_search_path' from source: unknown 32935 1726853720.38422: calling self._execute() 32935 1726853720.38515: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.38526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.38542: variable 'omit' from source: magic vars 32935 1726853720.38962: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.38981: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.38991: _execute() done 32935 1726853720.38999: dumping result to json 32935 1726853720.39006: done dumping result, returning 32935 1726853720.39016: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-84df-441d-00000000000c] 32935 1726853720.39024: sending task result for task 02083763-bbaf-84df-441d-00000000000c 32935 1726853720.39127: done sending task result for task 02083763-bbaf-84df-441d-00000000000c 32935 1726853720.39133: WORKER PROCESS EXITING 32935 1726853720.39173: no more pending results, returning what we have 32935 1726853720.39178: in VariableManager get_vars() 32935 1726853720.39222: Calling all_inventory to load vars for managed_node1 32935 1726853720.39225: Calling groups_inventory to load vars for managed_node1 32935 1726853720.39227: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.39240: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.39243: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.39246: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.39665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.39837: done with get_vars() 32935 1726853720.39844: variable 'ansible_search_path' from source: unknown 32935 1726853720.39856: we have included files to process 32935 1726853720.39860: generating all_blocks data 32935 1726853720.39863: done generating all_blocks data 32935 1726853720.39866: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32935 1726853720.39867: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32935 1726853720.39870: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32935 1726853720.40356: in VariableManager get_vars() 32935 1726853720.40381: done with get_vars() 32935 1726853720.40573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 32935 1726853720.41100: done processing included file 32935 1726853720.41102: iterating over new_blocks loaded from include file 32935 1726853720.41103: in VariableManager get_vars() 32935 1726853720.41120: done with get_vars() 32935 1726853720.41122: filtering new block on tags 32935 1726853720.41152: done filtering new block on tags 32935 1726853720.41155: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 32935 1726853720.41162: extending task lists for all hosts with included blocks 32935 1726853720.43037: done extending task lists 32935 1726853720.43039: done processing included files 32935 1726853720.43040: results queue empty 32935 1726853720.43041: checking for any_errors_fatal 32935 1726853720.43043: done checking for any_errors_fatal 32935 1726853720.43044: checking for max_fail_percentage 32935 1726853720.43045: done checking for max_fail_percentage 32935 1726853720.43046: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.43047: done checking to see if all hosts have failed 32935 1726853720.43047: getting the remaining hosts for this loop 32935 1726853720.43048: done getting the remaining hosts for this loop 32935 1726853720.43051: getting the next task for host managed_node1 32935 1726853720.43055: done getting next task for host managed_node1 32935 1726853720.43059: ^ task is: TASK: Ensure state in ["present", "absent"] 32935 1726853720.43062: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853720.43064: getting variables 32935 1726853720.43065: in VariableManager get_vars() 32935 1726853720.43082: Calling all_inventory to load vars for managed_node1 32935 1726853720.43084: Calling groups_inventory to load vars for managed_node1 32935 1726853720.43086: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.43092: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.43095: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.43097: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.43252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.43617: done with get_vars() 32935 1726853720.43626: done getting variables 32935 1726853720.43695: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:35:20 -0400 (0:00:00.059) 0:00:05.572 ****** 32935 1726853720.43719: entering _queue_task() for managed_node1/fail 32935 1726853720.43721: Creating lock for fail 32935 1726853720.44045: worker is 1 (out of 1 available) 32935 1726853720.44060: exiting _queue_task() for managed_node1/fail 32935 1726853720.44073: done queuing things up, now waiting for results queue to drain 32935 1726853720.44075: waiting for pending results... 
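The 'Ensure state in ["present", "absent"]' task queued above is a fail action behind a when guard; the skip record just below reports the false condition state not in ["present", "absent"]. A guard task of that shape, with a placeholder message, might read:

    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be 'present' or 'absent'"   # placeholder text; the real message is not in the log
      when: state not in ["present", "absent"]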
32935 1726853720.44490: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 32935 1726853720.44494: in run() - task 02083763-bbaf-84df-441d-00000000021b 32935 1726853720.44496: variable 'ansible_search_path' from source: unknown 32935 1726853720.44498: variable 'ansible_search_path' from source: unknown 32935 1726853720.44500: calling self._execute() 32935 1726853720.44603: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.44618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.44636: variable 'omit' from source: magic vars 32935 1726853720.45018: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.45036: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.45176: variable 'state' from source: include params 32935 1726853720.45188: Evaluated conditional (state not in ["present", "absent"]): False 32935 1726853720.45195: when evaluation is False, skipping this task 32935 1726853720.45202: _execute() done 32935 1726853720.45208: dumping result to json 32935 1726853720.45214: done dumping result, returning 32935 1726853720.45222: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-84df-441d-00000000021b] 32935 1726853720.45230: sending task result for task 02083763-bbaf-84df-441d-00000000021b skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 32935 1726853720.45417: no more pending results, returning what we have 32935 1726853720.45421: results queue empty 32935 1726853720.45422: checking for any_errors_fatal 32935 1726853720.45423: done checking for any_errors_fatal 32935 1726853720.45424: checking for max_fail_percentage 32935 1726853720.45425: done checking for max_fail_percentage 32935 1726853720.45426: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.45427: done checking to see if all hosts have failed 32935 1726853720.45427: getting the remaining hosts for this loop 32935 1726853720.45429: done getting the remaining hosts for this loop 32935 1726853720.45432: getting the next task for host managed_node1 32935 1726853720.45441: done getting next task for host managed_node1 32935 1726853720.45443: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 32935 1726853720.45446: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853720.45450: getting variables 32935 1726853720.45452: in VariableManager get_vars() 32935 1726853720.45496: Calling all_inventory to load vars for managed_node1 32935 1726853720.45500: Calling groups_inventory to load vars for managed_node1 32935 1726853720.45502: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.45514: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.45517: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.45520: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.45925: done sending task result for task 02083763-bbaf-84df-441d-00000000021b 32935 1726853720.45929: WORKER PROCESS EXITING 32935 1726853720.45954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.46181: done with get_vars() 32935 1726853720.46193: done getting variables 32935 1726853720.46256: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:35:20 -0400 (0:00:00.025) 0:00:05.598 ****** 32935 1726853720.46290: entering _queue_task() for managed_node1/fail 32935 1726853720.46557: worker is 1 (out of 1 available) 32935 1726853720.46778: exiting _queue_task() for managed_node1/fail 32935 1726853720.46789: done queuing things up, now waiting for results queue to drain 32935 1726853720.46791: waiting for pending results... 
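The analogous 'Ensure type in ["dummy", "tap", "veth"]' guard queued here checks the requested interface type, as the conditional evaluation below confirms. A matching sketch, again with a placeholder message:

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be dummy, tap, or veth"   # placeholder text
      when: type not in ["dummy", "tap", "veth"]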
32935 1726853720.46891: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 32935 1726853720.47010: in run() - task 02083763-bbaf-84df-441d-00000000021c 32935 1726853720.47031: variable 'ansible_search_path' from source: unknown 32935 1726853720.47041: variable 'ansible_search_path' from source: unknown 32935 1726853720.47086: calling self._execute() 32935 1726853720.47183: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.47194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.47208: variable 'omit' from source: magic vars 32935 1726853720.47597: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.47614: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.47762: variable 'type' from source: play vars 32935 1726853720.47780: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 32935 1726853720.47788: when evaluation is False, skipping this task 32935 1726853720.47795: _execute() done 32935 1726853720.47801: dumping result to json 32935 1726853720.47808: done dumping result, returning 32935 1726853720.47818: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-84df-441d-00000000021c] 32935 1726853720.47827: sending task result for task 02083763-bbaf-84df-441d-00000000021c skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 32935 1726853720.48064: no more pending results, returning what we have 32935 1726853720.48068: results queue empty 32935 1726853720.48069: checking for any_errors_fatal 32935 1726853720.48077: done checking for any_errors_fatal 32935 1726853720.48078: checking for max_fail_percentage 32935 1726853720.48080: done checking for max_fail_percentage 32935 1726853720.48081: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.48082: done checking to see if all hosts have failed 32935 1726853720.48083: getting the remaining hosts for this loop 32935 1726853720.48084: done getting the remaining hosts for this loop 32935 1726853720.48088: getting the next task for host managed_node1 32935 1726853720.48097: done getting next task for host managed_node1 32935 1726853720.48100: ^ task is: TASK: Include the task 'show_interfaces.yml' 32935 1726853720.48103: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853720.48108: getting variables 32935 1726853720.48109: in VariableManager get_vars() 32935 1726853720.48153: Calling all_inventory to load vars for managed_node1 32935 1726853720.48156: Calling groups_inventory to load vars for managed_node1 32935 1726853720.48162: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.48348: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.48352: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.48361: done sending task result for task 02083763-bbaf-84df-441d-00000000021c 32935 1726853720.48364: WORKER PROCESS EXITING 32935 1726853720.48369: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.48586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.48773: done with get_vars() 32935 1726853720.48783: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:35:20 -0400 (0:00:00.025) 0:00:05.624 ****** 32935 1726853720.48875: entering _queue_task() for managed_node1/include_tasks 32935 1726853720.49134: worker is 1 (out of 1 available) 32935 1726853720.49146: exiting _queue_task() for managed_node1/include_tasks 32935 1726853720.49161: done queuing things up, now waiting for results queue to drain 32935 1726853720.49163: waiting for pending results... 32935 1726853720.49416: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 32935 1726853720.49526: in run() - task 02083763-bbaf-84df-441d-00000000021d 32935 1726853720.49544: variable 'ansible_search_path' from source: unknown 32935 1726853720.49552: variable 'ansible_search_path' from source: unknown 32935 1726853720.49605: calling self._execute() 32935 1726853720.49695: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.49712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.49727: variable 'omit' from source: magic vars 32935 1726853720.50104: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.50121: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.50131: _execute() done 32935 1726853720.50142: dumping result to json 32935 1726853720.50176: done dumping result, returning 32935 1726853720.50179: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-84df-441d-00000000021d] 32935 1726853720.50182: sending task result for task 02083763-bbaf-84df-441d-00000000021d 32935 1726853720.50494: no more pending results, returning what we have 32935 1726853720.50498: in VariableManager get_vars() 32935 1726853720.50537: Calling all_inventory to load vars for managed_node1 32935 1726853720.50540: Calling groups_inventory to load vars for managed_node1 32935 1726853720.50542: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.50553: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.50556: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.50562: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.50780: done sending task result for task 02083763-bbaf-84df-441d-00000000021d 32935 
1726853720.50784: WORKER PROCESS EXITING 32935 1726853720.50807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.51002: done with get_vars() 32935 1726853720.51009: variable 'ansible_search_path' from source: unknown 32935 1726853720.51010: variable 'ansible_search_path' from source: unknown 32935 1726853720.51044: we have included files to process 32935 1726853720.51045: generating all_blocks data 32935 1726853720.51046: done generating all_blocks data 32935 1726853720.51050: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32935 1726853720.51051: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32935 1726853720.51053: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32935 1726853720.51176: in VariableManager get_vars() 32935 1726853720.51201: done with get_vars() 32935 1726853720.51307: done processing included file 32935 1726853720.51309: iterating over new_blocks loaded from include file 32935 1726853720.51310: in VariableManager get_vars() 32935 1726853720.51326: done with get_vars() 32935 1726853720.51328: filtering new block on tags 32935 1726853720.51343: done filtering new block on tags 32935 1726853720.51346: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 32935 1726853720.51350: extending task lists for all hosts with included blocks 32935 1726853720.51792: done extending task lists 32935 1726853720.51794: done processing included files 32935 1726853720.51794: results queue empty 32935 1726853720.51795: checking for any_errors_fatal 32935 1726853720.51798: done checking for any_errors_fatal 32935 1726853720.51799: checking for max_fail_percentage 32935 1726853720.51800: done checking for max_fail_percentage 32935 1726853720.51800: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.51801: done checking to see if all hosts have failed 32935 1726853720.51802: getting the remaining hosts for this loop 32935 1726853720.51803: done getting the remaining hosts for this loop 32935 1726853720.51805: getting the next task for host managed_node1 32935 1726853720.51809: done getting next task for host managed_node1 32935 1726853720.51811: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 32935 1726853720.51814: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853720.51816: getting variables 32935 1726853720.51816: in VariableManager get_vars() 32935 1726853720.51828: Calling all_inventory to load vars for managed_node1 32935 1726853720.51830: Calling groups_inventory to load vars for managed_node1 32935 1726853720.51832: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.51838: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.51840: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.51843: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.51976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.52149: done with get_vars() 32935 1726853720.52160: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:35:20 -0400 (0:00:00.033) 0:00:05.657 ****** 32935 1726853720.52233: entering _queue_task() for managed_node1/include_tasks 32935 1726853720.52529: worker is 1 (out of 1 available) 32935 1726853720.52541: exiting _queue_task() for managed_node1/include_tasks 32935 1726853720.52554: done queuing things up, now waiting for results queue to drain 32935 1726853720.52555: waiting for pending results... 32935 1726853720.52835: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 32935 1726853720.52954: in run() - task 02083763-bbaf-84df-441d-000000000314 32935 1726853720.52980: variable 'ansible_search_path' from source: unknown 32935 1726853720.52989: variable 'ansible_search_path' from source: unknown 32935 1726853720.53032: calling self._execute() 32935 1726853720.53123: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.53134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.53148: variable 'omit' from source: magic vars 32935 1726853720.53574: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.53591: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.53604: _execute() done 32935 1726853720.53612: dumping result to json 32935 1726853720.53621: done dumping result, returning 32935 1726853720.53633: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-84df-441d-000000000314] 32935 1726853720.53643: sending task result for task 02083763-bbaf-84df-441d-000000000314 32935 1726853720.53800: no more pending results, returning what we have 32935 1726853720.53809: in VariableManager get_vars() 32935 1726853720.53857: Calling all_inventory to load vars for managed_node1 32935 1726853720.53863: Calling groups_inventory to load vars for managed_node1 32935 1726853720.53866: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.53883: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.53886: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.53889: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.54494: done sending task result for task 02083763-bbaf-84df-441d-000000000314 32935 1726853720.54497: WORKER PROCESS EXITING 32935 1726853720.54520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 32935 1726853720.54827: done with get_vars() 32935 1726853720.54835: variable 'ansible_search_path' from source: unknown 32935 1726853720.54836: variable 'ansible_search_path' from source: unknown 32935 1726853720.55103: we have included files to process 32935 1726853720.55107: generating all_blocks data 32935 1726853720.55109: done generating all_blocks data 32935 1726853720.55110: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32935 1726853720.55111: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32935 1726853720.55117: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32935 1726853720.55567: done processing included file 32935 1726853720.55569: iterating over new_blocks loaded from include file 32935 1726853720.55572: in VariableManager get_vars() 32935 1726853720.55592: done with get_vars() 32935 1726853720.55595: filtering new block on tags 32935 1726853720.55612: done filtering new block on tags 32935 1726853720.55614: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 32935 1726853720.55619: extending task lists for all hosts with included blocks 32935 1726853720.55909: done extending task lists 32935 1726853720.55911: done processing included files 32935 1726853720.55912: results queue empty 32935 1726853720.55912: checking for any_errors_fatal 32935 1726853720.55916: done checking for any_errors_fatal 32935 1726853720.55917: checking for max_fail_percentage 32935 1726853720.55918: done checking for max_fail_percentage 32935 1726853720.55919: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.55919: done checking to see if all hosts have failed 32935 1726853720.55920: getting the remaining hosts for this loop 32935 1726853720.55921: done getting the remaining hosts for this loop 32935 1726853720.55924: getting the next task for host managed_node1 32935 1726853720.55929: done getting next task for host managed_node1 32935 1726853720.55931: ^ task is: TASK: Gather current interface info 32935 1726853720.55934: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 32935 1726853720.55936: getting variables 32935 1726853720.55937: in VariableManager get_vars() 32935 1726853720.55951: Calling all_inventory to load vars for managed_node1 32935 1726853720.55954: Calling groups_inventory to load vars for managed_node1 32935 1726853720.55955: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.56135: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.56138: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.56141: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.56352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.56540: done with get_vars() 32935 1726853720.56549: done getting variables 32935 1726853720.56596: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:35:20 -0400 (0:00:00.043) 0:00:05.701 ****** 32935 1726853720.56627: entering _queue_task() for managed_node1/command 32935 1726853720.56925: worker is 1 (out of 1 available) 32935 1726853720.56938: exiting _queue_task() for managed_node1/command 32935 1726853720.56950: done queuing things up, now waiting for results queue to drain 32935 1726853720.56952: waiting for pending results... 
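The 'Gather current interface info' task (get_current_interfaces.yml:3) dispatches the command action; the module arguments recorded at the end of this trace are ls -1 with chdir=/sys/class/net. A task producing that invocation could look like the following, where the register name is an assumption tied to the _current_interfaces variable consumed earlier:

    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces   # assumed; matches the variable used by the earlier set_fact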
32935 1726853720.57204: running TaskExecutor() for managed_node1/TASK: Gather current interface info 32935 1726853720.57326: in run() - task 02083763-bbaf-84df-441d-00000000034b 32935 1726853720.57348: variable 'ansible_search_path' from source: unknown 32935 1726853720.57357: variable 'ansible_search_path' from source: unknown 32935 1726853720.57403: calling self._execute() 32935 1726853720.57512: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.57629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.57633: variable 'omit' from source: magic vars 32935 1726853720.58309: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.58327: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.58338: variable 'omit' from source: magic vars 32935 1726853720.58400: variable 'omit' from source: magic vars 32935 1726853720.58433: variable 'omit' from source: magic vars 32935 1726853720.58479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853720.58635: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853720.58638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853720.58640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.58650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.58687: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853720.58694: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.58700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.58793: Set connection var ansible_timeout to 10 32935 1726853720.58806: Set connection var ansible_shell_type to sh 32935 1726853720.58985: Set connection var ansible_pipelining to False 32935 1726853720.58988: Set connection var ansible_connection to ssh 32935 1726853720.58990: Set connection var ansible_shell_executable to /bin/sh 32935 1726853720.58993: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853720.58995: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.58997: variable 'ansible_connection' from source: unknown 32935 1726853720.58999: variable 'ansible_module_compression' from source: unknown 32935 1726853720.59001: variable 'ansible_shell_type' from source: unknown 32935 1726853720.59002: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.59004: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.59006: variable 'ansible_pipelining' from source: unknown 32935 1726853720.59008: variable 'ansible_timeout' from source: unknown 32935 1726853720.59010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.59124: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853720.59141: variable 'omit' from source: magic vars 32935 
1726853720.59157: starting attempt loop 32935 1726853720.59166: running the handler 32935 1726853720.59187: _low_level_execute_command(): starting 32935 1726853720.59203: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853720.60050: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853720.60130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853720.60191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853720.60261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853720.60295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853720.60335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853720.60406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853720.62109: stdout chunk (state=3): >>>/root <<< 32935 1726853720.62377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853720.62381: stdout chunk (state=3): >>><<< 32935 1726853720.62384: stderr chunk (state=3): >>><<< 32935 1726853720.62388: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853720.62670: _low_level_execute_command(): starting 32935 1726853720.62678: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447 `" && echo ansible-tmp-1726853720.6236923-33253-131791435951447="` echo /root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447 `" ) && sleep 0' 32935 1726853720.63193: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853720.63197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853720.63200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853720.63211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853720.63284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853720.63552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853720.65409: stdout chunk (state=3): >>>ansible-tmp-1726853720.6236923-33253-131791435951447=/root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447 <<< 32935 1726853720.65510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853720.65544: stderr chunk (state=3): >>><<< 32935 1726853720.65592: stdout chunk (state=3): >>><<< 32935 1726853720.65829: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853720.6236923-33253-131791435951447=/root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853720.65832: variable 'ansible_module_compression' from source: unknown 32935 
1726853720.65836: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853720.66102: variable 'ansible_facts' from source: unknown 32935 1726853720.66277: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/AnsiballZ_command.py 32935 1726853720.66560: Sending initial data 32935 1726853720.66564: Sent initial data (156 bytes) 32935 1726853720.67557: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853720.67637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853720.67648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853720.67693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853720.67704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853720.67854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853720.67906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853720.69469: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853720.69505: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853720.69553: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpw_xsuu59 /root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/AnsiballZ_command.py <<< 32935 1726853720.69560: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/AnsiballZ_command.py" <<< 32935 1726853720.69593: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpw_xsuu59" to remote "/root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/AnsiballZ_command.py" <<< 32935 1726853720.70898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853720.70956: stderr chunk (state=3): >>><<< 32935 1726853720.70968: stdout chunk (state=3): >>><<< 32935 1726853720.71278: done transferring module to remote 32935 1726853720.71281: _low_level_execute_command(): starting 32935 1726853720.71283: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/ /root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/AnsiballZ_command.py && sleep 0' 32935 1726853720.72188: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853720.72192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853720.72194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853720.72396: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853720.72541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853720.72585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853720.74478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853720.74481: stderr chunk (state=3): >>><<< 32935 1726853720.74483: stdout chunk (state=3): >>><<< 32935 1726853720.74565: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853720.74568: _low_level_execute_command(): starting 32935 1726853720.74573: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/AnsiballZ_command.py && sleep 0' 32935 1726853720.75850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853720.75869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853720.75983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853720.76006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853720.76019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853720.76104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853720.91514: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:20.910964", "end": "2024-09-20 13:35:20.914245", "delta": "0:00:00.003281", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853720.93003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853720.93028: stderr chunk (state=3): >>><<< 32935 1726853720.93031: stdout chunk (state=3): >>><<< 32935 1726853720.93047: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:20.910964", "end": "2024-09-20 13:35:20.914245", "delta": "0:00:00.003281", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
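The module_args echoed in the result above (chdir: /sys/class/net, _raw_params: "ls -1") correspond to a command task along the following lines. This is a reconstruction from the logged invocation, not the literal contents of get_current_interfaces.yml; the register name is inferred from the '_current_interfaces' variable that the log reads a few steps later.

    # Reconstructed from the logged module_args; the real task may differ in
    # register name and may carry extra keywords such as changed_when.
    - name: Gather current interface info
      ansible.builtin.command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces

Note that the module itself returns "changed": true while the displayed task result below reports "changed": false, which is consistent with the task carrying something like changed_when: false. Also, because ansible_pipelining is False for this connection, the AnsiballZ payload is uploaded over SFTP to a remote temp directory, chmod'd, executed with /usr/bin/python3.12, and then removed; enabling pipelining (for example via a host var ansible_pipelining: true) would skip that temp-file round trip for modules that support it.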
32935 1726853720.93082: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853720.93090: _low_level_execute_command(): starting 32935 1726853720.93095: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853720.6236923-33253-131791435951447/ > /dev/null 2>&1 && sleep 0' 32935 1726853720.93538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853720.93541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853720.93544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853720.93546: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853720.93548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853720.93605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853720.93615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853720.93617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853720.93655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853720.95455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853720.95484: stderr chunk (state=3): >>><<< 32935 1726853720.95487: stdout chunk (state=3): >>><<< 32935 1726853720.95499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853720.95505: handler run complete 32935 1726853720.95523: Evaluated conditional (False): False 32935 1726853720.95531: attempt loop complete, returning result 32935 1726853720.95535: _execute() done 32935 1726853720.95538: dumping result to json 32935 1726853720.95540: done dumping result, returning 32935 1726853720.95550: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [02083763-bbaf-84df-441d-00000000034b] 32935 1726853720.95552: sending task result for task 02083763-bbaf-84df-441d-00000000034b 32935 1726853720.95647: done sending task result for task 02083763-bbaf-84df-441d-00000000034b 32935 1726853720.95652: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003281", "end": "2024-09-20 13:35:20.914245", "rc": 0, "start": "2024-09-20 13:35:20.910964" } STDOUT: bonding_masters eth0 lo 32935 1726853720.95797: no more pending results, returning what we have 32935 1726853720.95800: results queue empty 32935 1726853720.95801: checking for any_errors_fatal 32935 1726853720.95802: done checking for any_errors_fatal 32935 1726853720.95803: checking for max_fail_percentage 32935 1726853720.95804: done checking for max_fail_percentage 32935 1726853720.95805: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.95806: done checking to see if all hosts have failed 32935 1726853720.95807: getting the remaining hosts for this loop 32935 1726853720.95808: done getting the remaining hosts for this loop 32935 1726853720.95810: getting the next task for host managed_node1 32935 1726853720.95816: done getting next task for host managed_node1 32935 1726853720.95818: ^ task is: TASK: Set current_interfaces 32935 1726853720.95822: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853720.95825: getting variables 32935 1726853720.95826: in VariableManager get_vars() 32935 1726853720.95853: Calling all_inventory to load vars for managed_node1 32935 1726853720.95855: Calling groups_inventory to load vars for managed_node1 32935 1726853720.95857: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.95868: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.95870: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.95875: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.95996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.96112: done with get_vars() 32935 1726853720.96120: done getting variables 32935 1726853720.96162: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:35:20 -0400 (0:00:00.395) 0:00:06.097 ****** 32935 1726853720.96185: entering _queue_task() for managed_node1/set_fact 32935 1726853720.96391: worker is 1 (out of 1 available) 32935 1726853720.96402: exiting _queue_task() for managed_node1/set_fact 32935 1726853720.96415: done queuing things up, now waiting for results queue to drain 32935 1726853720.96417: waiting for pending results... 
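Judging by the fact value reported a little further down (current_interfaces == ['bonding_masters', 'eth0', 'lo'], exactly the stdout lines of the ls -1 above), the Set current_interfaces task queued here is roughly the following set_fact; the exact Jinja2 expression in get_current_interfaces.yml may differ.

    # Sketch only; the real expression may filter or sort the list differently.
    - name: Set current_interfaces
      ansible.builtin.set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"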
32935 1726853720.96566: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 32935 1726853720.96648: in run() - task 02083763-bbaf-84df-441d-00000000034c 32935 1726853720.96656: variable 'ansible_search_path' from source: unknown 32935 1726853720.96663: variable 'ansible_search_path' from source: unknown 32935 1726853720.96710: calling self._execute() 32935 1726853720.96765: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.96768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.96780: variable 'omit' from source: magic vars 32935 1726853720.97034: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.97045: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.97050: variable 'omit' from source: magic vars 32935 1726853720.97089: variable 'omit' from source: magic vars 32935 1726853720.97163: variable '_current_interfaces' from source: set_fact 32935 1726853720.97208: variable 'omit' from source: magic vars 32935 1726853720.97238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853720.97267: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853720.97284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853720.97303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.97307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.97330: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853720.97333: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.97336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.97405: Set connection var ansible_timeout to 10 32935 1726853720.97408: Set connection var ansible_shell_type to sh 32935 1726853720.97416: Set connection var ansible_pipelining to False 32935 1726853720.97419: Set connection var ansible_connection to ssh 32935 1726853720.97423: Set connection var ansible_shell_executable to /bin/sh 32935 1726853720.97428: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853720.97447: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.97450: variable 'ansible_connection' from source: unknown 32935 1726853720.97452: variable 'ansible_module_compression' from source: unknown 32935 1726853720.97455: variable 'ansible_shell_type' from source: unknown 32935 1726853720.97457: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.97461: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.97464: variable 'ansible_pipelining' from source: unknown 32935 1726853720.97466: variable 'ansible_timeout' from source: unknown 32935 1726853720.97468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.97566: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 32935 1726853720.97576: variable 'omit' from source: magic vars 32935 1726853720.97581: starting attempt loop 32935 1726853720.97584: running the handler 32935 1726853720.97593: handler run complete 32935 1726853720.97601: attempt loop complete, returning result 32935 1726853720.97604: _execute() done 32935 1726853720.97606: dumping result to json 32935 1726853720.97608: done dumping result, returning 32935 1726853720.97616: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [02083763-bbaf-84df-441d-00000000034c] 32935 1726853720.97620: sending task result for task 02083763-bbaf-84df-441d-00000000034c 32935 1726853720.97698: done sending task result for task 02083763-bbaf-84df-441d-00000000034c 32935 1726853720.97701: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 32935 1726853720.97788: no more pending results, returning what we have 32935 1726853720.97790: results queue empty 32935 1726853720.97791: checking for any_errors_fatal 32935 1726853720.97796: done checking for any_errors_fatal 32935 1726853720.97796: checking for max_fail_percentage 32935 1726853720.97798: done checking for max_fail_percentage 32935 1726853720.97798: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.97799: done checking to see if all hosts have failed 32935 1726853720.97800: getting the remaining hosts for this loop 32935 1726853720.97801: done getting the remaining hosts for this loop 32935 1726853720.97804: getting the next task for host managed_node1 32935 1726853720.97811: done getting next task for host managed_node1 32935 1726853720.97813: ^ task is: TASK: Show current_interfaces 32935 1726853720.97817: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853720.97820: getting variables 32935 1726853720.97821: in VariableManager get_vars() 32935 1726853720.97853: Calling all_inventory to load vars for managed_node1 32935 1726853720.97855: Calling groups_inventory to load vars for managed_node1 32935 1726853720.97859: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.97866: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.97868: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.97870: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.97977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853720.98110: done with get_vars() 32935 1726853720.98117: done getting variables 32935 1726853720.98155: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:35:20 -0400 (0:00:00.019) 0:00:06.117 ****** 32935 1726853720.98180: entering _queue_task() for managed_node1/debug 32935 1726853720.98373: worker is 1 (out of 1 available) 32935 1726853720.98388: exiting _queue_task() for managed_node1/debug 32935 1726853720.98401: done queuing things up, now waiting for results queue to drain 32935 1726853720.98403: waiting for pending results... 32935 1726853720.98544: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 32935 1726853720.98614: in run() - task 02083763-bbaf-84df-441d-000000000315 32935 1726853720.98625: variable 'ansible_search_path' from source: unknown 32935 1726853720.98628: variable 'ansible_search_path' from source: unknown 32935 1726853720.98657: calling self._execute() 32935 1726853720.98721: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.98725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.98734: variable 'omit' from source: magic vars 32935 1726853720.98994: variable 'ansible_distribution_major_version' from source: facts 32935 1726853720.99004: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853720.99010: variable 'omit' from source: magic vars 32935 1726853720.99040: variable 'omit' from source: magic vars 32935 1726853720.99106: variable 'current_interfaces' from source: set_fact 32935 1726853720.99127: variable 'omit' from source: magic vars 32935 1726853720.99160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853720.99188: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853720.99204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853720.99217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.99227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853720.99249: 
variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853720.99252: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.99255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.99325: Set connection var ansible_timeout to 10 32935 1726853720.99330: Set connection var ansible_shell_type to sh 32935 1726853720.99337: Set connection var ansible_pipelining to False 32935 1726853720.99339: Set connection var ansible_connection to ssh 32935 1726853720.99344: Set connection var ansible_shell_executable to /bin/sh 32935 1726853720.99349: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853720.99367: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.99372: variable 'ansible_connection' from source: unknown 32935 1726853720.99375: variable 'ansible_module_compression' from source: unknown 32935 1726853720.99377: variable 'ansible_shell_type' from source: unknown 32935 1726853720.99379: variable 'ansible_shell_executable' from source: unknown 32935 1726853720.99381: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853720.99385: variable 'ansible_pipelining' from source: unknown 32935 1726853720.99388: variable 'ansible_timeout' from source: unknown 32935 1726853720.99392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853720.99499: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853720.99511: variable 'omit' from source: magic vars 32935 1726853720.99514: starting attempt loop 32935 1726853720.99517: running the handler 32935 1726853720.99550: handler run complete 32935 1726853720.99563: attempt loop complete, returning result 32935 1726853720.99566: _execute() done 32935 1726853720.99568: dumping result to json 32935 1726853720.99572: done dumping result, returning 32935 1726853720.99576: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [02083763-bbaf-84df-441d-000000000315] 32935 1726853720.99581: sending task result for task 02083763-bbaf-84df-441d-000000000315 32935 1726853720.99656: done sending task result for task 02083763-bbaf-84df-441d-000000000315 32935 1726853720.99661: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 32935 1726853720.99710: no more pending results, returning what we have 32935 1726853720.99713: results queue empty 32935 1726853720.99714: checking for any_errors_fatal 32935 1726853720.99719: done checking for any_errors_fatal 32935 1726853720.99720: checking for max_fail_percentage 32935 1726853720.99721: done checking for max_fail_percentage 32935 1726853720.99722: checking to see if all hosts have failed and the running result is not ok 32935 1726853720.99723: done checking to see if all hosts have failed 32935 1726853720.99723: getting the remaining hosts for this loop 32935 1726853720.99725: done getting the remaining hosts for this loop 32935 1726853720.99728: getting the next task for host managed_node1 32935 1726853720.99736: done getting next task for host managed_node1 32935 1726853720.99739: ^ task is: TASK: Install iproute 32935 1726853720.99742: ^ state is: HOST STATE: block=2, task=4, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853720.99745: getting variables 32935 1726853720.99747: in VariableManager get_vars() 32935 1726853720.99788: Calling all_inventory to load vars for managed_node1 32935 1726853720.99791: Calling groups_inventory to load vars for managed_node1 32935 1726853720.99792: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853720.99801: Calling all_plugins_play to load vars for managed_node1 32935 1726853720.99803: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853720.99805: Calling groups_plugins_play to load vars for managed_node1 32935 1726853720.99925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853721.00040: done with get_vars() 32935 1726853721.00047: done getting variables 32935 1726853721.00090: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:35:20 -0400 (0:00:00.019) 0:00:06.136 ****** 32935 1726853721.00112: entering _queue_task() for managed_node1/package 32935 1726853721.00306: worker is 1 (out of 1 available) 32935 1726853721.00321: exiting _queue_task() for managed_node1/package 32935 1726853721.00334: done queuing things up, now waiting for results queue to drain 32935 1726853721.00336: waiting for pending results... 
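The Show current_interfaces task that just completed printed MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'], so it is presumably a debug task of roughly this shape (inferred from the output, not copied from show_interfaces.yml):

    # Sketch inferred from the logged MSG line.
    - name: Show current_interfaces
      ansible.builtin.debug:
        msg: "current_interfaces: {{ current_interfaces }}"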
32935 1726853721.00480: running TaskExecutor() for managed_node1/TASK: Install iproute 32935 1726853721.00541: in run() - task 02083763-bbaf-84df-441d-00000000021e 32935 1726853721.00551: variable 'ansible_search_path' from source: unknown 32935 1726853721.00554: variable 'ansible_search_path' from source: unknown 32935 1726853721.00585: calling self._execute() 32935 1726853721.00648: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853721.00652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853721.00663: variable 'omit' from source: magic vars 32935 1726853721.00935: variable 'ansible_distribution_major_version' from source: facts 32935 1726853721.00944: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853721.00950: variable 'omit' from source: magic vars 32935 1726853721.00976: variable 'omit' from source: magic vars 32935 1726853721.01102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853721.02847: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853721.02895: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853721.02921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853721.02945: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853721.02978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853721.03047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853721.03075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853721.03109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853721.03135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853721.03146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853721.03220: variable '__network_is_ostree' from source: set_fact 32935 1726853721.03225: variable 'omit' from source: magic vars 32935 1726853721.03247: variable 'omit' from source: magic vars 32935 1726853721.03269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853721.03293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853721.03307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853721.03327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 32935 1726853721.03336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853721.03361: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853721.03365: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853721.03367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853721.03432: Set connection var ansible_timeout to 10 32935 1726853721.03439: Set connection var ansible_shell_type to sh 32935 1726853721.03446: Set connection var ansible_pipelining to False 32935 1726853721.03448: Set connection var ansible_connection to ssh 32935 1726853721.03453: Set connection var ansible_shell_executable to /bin/sh 32935 1726853721.03460: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853721.03484: variable 'ansible_shell_executable' from source: unknown 32935 1726853721.03487: variable 'ansible_connection' from source: unknown 32935 1726853721.03489: variable 'ansible_module_compression' from source: unknown 32935 1726853721.03492: variable 'ansible_shell_type' from source: unknown 32935 1726853721.03494: variable 'ansible_shell_executable' from source: unknown 32935 1726853721.03496: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853721.03498: variable 'ansible_pipelining' from source: unknown 32935 1726853721.03500: variable 'ansible_timeout' from source: unknown 32935 1726853721.03502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853721.03572: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853721.03581: variable 'omit' from source: magic vars 32935 1726853721.03587: starting attempt loop 32935 1726853721.03590: running the handler 32935 1726853721.03596: variable 'ansible_facts' from source: unknown 32935 1726853721.03599: variable 'ansible_facts' from source: unknown 32935 1726853721.03625: _low_level_execute_command(): starting 32935 1726853721.03631: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853721.04131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853721.04135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.04138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853721.04140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853721.04143: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.04184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853721.04198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853721.04254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853721.05905: stdout chunk (state=3): >>>/root <<< 32935 1726853721.06006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853721.06033: stderr chunk (state=3): >>><<< 32935 1726853721.06036: stdout chunk (state=3): >>><<< 32935 1726853721.06057: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853721.06070: _low_level_execute_command(): starting 32935 1726853721.06077: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918 `" && echo ansible-tmp-1726853721.0605567-33295-245758016638918="` echo /root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918 `" ) && sleep 0' 32935 1726853721.06490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853721.06493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.06496: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853721.06498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.06546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853721.06549: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853721.06597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853721.08466: stdout chunk (state=3): >>>ansible-tmp-1726853721.0605567-33295-245758016638918=/root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918 <<< 32935 1726853721.08580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853721.08603: stderr chunk (state=3): >>><<< 32935 1726853721.08606: stdout chunk (state=3): >>><<< 32935 1726853721.08619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853721.0605567-33295-245758016638918=/root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853721.08648: variable 'ansible_module_compression' from source: unknown 32935 1726853721.08695: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 32935 1726853721.08699: ANSIBALLZ: Acquiring lock 32935 1726853721.08701: ANSIBALLZ: Lock acquired: 140683294872048 32935 1726853721.08703: ANSIBALLZ: Creating module 32935 1726853721.25482: ANSIBALLZ: Writing module into payload 32935 1726853721.25794: ANSIBALLZ: Writing module 32935 1726853721.25798: ANSIBALLZ: Renaming module 32935 1726853721.25800: ANSIBALLZ: Done creating module 32935 1726853721.25802: variable 'ansible_facts' from source: unknown 32935 1726853721.26377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/AnsiballZ_dnf.py 32935 1726853721.26626: Sending initial data 32935 1726853721.26637: Sent initial data (152 bytes) 32935 1726853721.27687: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853721.27924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.27998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853721.28020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853721.28086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853721.28298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853721.29928: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32935 1726853721.29989: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853721.30006: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/AnsiballZ_dnf.py" <<< 32935 1726853721.30016: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp3_p4n5rh /root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/AnsiballZ_dnf.py <<< 32935 1726853721.30044: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp3_p4n5rh" to remote "/root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/AnsiballZ_dnf.py" <<< 32935 1726853721.30095: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/AnsiballZ_dnf.py" <<< 32935 1726853721.31526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853721.31641: stderr chunk (state=3): >>><<< 32935 1726853721.31683: stdout chunk (state=3): >>><<< 32935 1726853721.31708: done transferring module to remote 32935 1726853721.31881: _low_level_execute_command(): starting 32935 1726853721.31885: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/ /root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/AnsiballZ_dnf.py && sleep 0' 32935 1726853721.32984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.33142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853721.33287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853721.33366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853721.35285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853721.35289: stderr chunk (state=3): >>><<< 32935 1726853721.35292: stdout chunk (state=3): >>><<< 32935 1726853721.35316: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853721.35325: _low_level_execute_command(): starting 32935 1726853721.35364: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/AnsiballZ_dnf.py && sleep 0' 32935 1726853721.36567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853721.36570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853721.36575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.36577: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853721.36579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.36894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853721.37208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853721.77925: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 32935 1726853721.82479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853721.82483: stdout chunk (state=3): >>><<< 32935 1726853721.82486: stderr chunk (state=3): >>><<< 32935 1726853721.82488: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
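The dnf invocation above (name: ["iproute"], state: "present", all other options defaulted) is what the package action plugin produced for the Install iproute task. A plausible reconstruction follows; the "attempts": 1 field and the later evaluation of '__install_status is success' suggest the task also registers __install_status and retries until success, but the actual retries/delay values are not visible in this log and are placeholders here.

    # Reconstruction from the logged module_args; register/until/retries are
    # inferred from the result fields and may not match the real task exactly.
    - name: Install iproute
      ansible.builtin.package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success
      retries: 3        # placeholder value, not shown in the log
      delay: 5          # placeholder value, not shown in the log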
32935 1726853721.82491: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853721.82497: _low_level_execute_command(): starting 32935 1726853721.82500: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853721.0605567-33295-245758016638918/ > /dev/null 2>&1 && sleep 0' 32935 1726853721.83791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.83843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853721.83865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853721.83894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853721.83969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853721.85927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853721.85962: stdout chunk (state=3): >>><<< 32935 1726853721.86279: stderr chunk (state=3): >>><<< 32935 1726853721.86282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853721.86285: handler run complete 32935 1726853721.86877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853721.87229: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853721.87276: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853721.87396: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853721.87476: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853721.87630: variable '__install_status' from source: unknown 32935 1726853721.87698: Evaluated conditional (__install_status is success): True 32935 1726853721.87787: attempt loop complete, returning result 32935 1726853721.87795: _execute() done 32935 1726853721.87802: dumping result to json 32935 1726853721.87813: done dumping result, returning 32935 1726853721.87826: done running TaskExecutor() for managed_node1/TASK: Install iproute [02083763-bbaf-84df-441d-00000000021e] 32935 1726853721.87835: sending task result for task 02083763-bbaf-84df-441d-00000000021e 32935 1726853721.88146: done sending task result for task 02083763-bbaf-84df-441d-00000000021e 32935 1726853721.88149: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 32935 1726853721.88237: no more pending results, returning what we have 32935 1726853721.88241: results queue empty 32935 1726853721.88242: checking for any_errors_fatal 32935 1726853721.88247: done checking for any_errors_fatal 32935 1726853721.88248: checking for max_fail_percentage 32935 1726853721.88250: done checking for max_fail_percentage 32935 1726853721.88250: checking to see if all hosts have failed and the running result is not ok 32935 1726853721.88252: done checking to see if all hosts have failed 32935 1726853721.88252: getting the remaining hosts for this loop 32935 1726853721.88254: done getting the remaining hosts for this loop 32935 1726853721.88259: getting the next task for host managed_node1 32935 1726853721.88267: done getting next task for host managed_node1 32935 1726853721.88270: ^ task is: TASK: Create veth interface {{ interface }} 32935 1726853721.88275: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853721.88279: getting variables 32935 1726853721.88281: in VariableManager get_vars() 32935 1726853721.88319: Calling all_inventory to load vars for managed_node1 32935 1726853721.88322: Calling groups_inventory to load vars for managed_node1 32935 1726853721.88325: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853721.88336: Calling all_plugins_play to load vars for managed_node1 32935 1726853721.88339: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853721.88343: Calling groups_plugins_play to load vars for managed_node1 32935 1726853721.88856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853721.89364: done with get_vars() 32935 1726853721.89577: done getting variables 32935 1726853721.89635: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853721.89866: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:35:21 -0400 (0:00:00.900) 0:00:07.036 ****** 32935 1726853721.90126: entering _queue_task() for managed_node1/command 32935 1726853721.90928: worker is 1 (out of 1 available) 32935 1726853721.90942: exiting _queue_task() for managed_node1/command 32935 1726853721.90954: done queuing things up, now waiting for results queue to drain 32935 1726853721.90956: waiting for pending results... 
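The trace below runs TASK [Create veth interface lsr101] from manage_test_interface.yml with the command action, looping over items (the 'items' lookup is loaded) and re-evaluating "type == 'veth' and state == 'present' and interface not in current_interfaces" for each item. A hedged reconstruction of that task, listing only the commands that appear in this excerpt (the real task may carry additional items):

- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
  when:
    - type == 'veth'
    - state == 'present'
    - interface not in current_interfaces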
32935 1726853721.91689: running TaskExecutor() for managed_node1/TASK: Create veth interface lsr101 32935 1726853721.91695: in run() - task 02083763-bbaf-84df-441d-00000000021f 32935 1726853721.91698: variable 'ansible_search_path' from source: unknown 32935 1726853721.91706: variable 'ansible_search_path' from source: unknown 32935 1726853721.92330: variable 'interface' from source: play vars 32935 1726853721.92531: variable 'interface' from source: play vars 32935 1726853721.92676: variable 'interface' from source: play vars 32935 1726853721.92934: Loaded config def from plugin (lookup/items) 32935 1726853721.92977: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 32935 1726853721.93002: variable 'omit' from source: magic vars 32935 1726853721.93336: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853721.93339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853721.93342: variable 'omit' from source: magic vars 32935 1726853721.93887: variable 'ansible_distribution_major_version' from source: facts 32935 1726853721.93890: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853721.94321: variable 'type' from source: play vars 32935 1726853721.94341: variable 'state' from source: include params 32935 1726853721.94385: variable 'interface' from source: play vars 32935 1726853721.94394: variable 'current_interfaces' from source: set_fact 32935 1726853721.94449: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 32935 1726853721.94452: variable 'omit' from source: magic vars 32935 1726853721.94595: variable 'omit' from source: magic vars 32935 1726853721.94629: variable 'item' from source: unknown 32935 1726853721.94779: variable 'item' from source: unknown 32935 1726853721.94834: variable 'omit' from source: magic vars 32935 1726853721.94976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853721.94990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853721.95013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853721.95139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853721.95143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853721.95183: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853721.95196: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853721.95301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853721.95519: Set connection var ansible_timeout to 10 32935 1726853721.95522: Set connection var ansible_shell_type to sh 32935 1726853721.95525: Set connection var ansible_pipelining to False 32935 1726853721.95527: Set connection var ansible_connection to ssh 32935 1726853721.95529: Set connection var ansible_shell_executable to /bin/sh 32935 1726853721.95531: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853721.95590: variable 'ansible_shell_executable' from source: unknown 32935 1726853721.95598: variable 'ansible_connection' from source: unknown 32935 1726853721.95605: 
variable 'ansible_module_compression' from source: unknown 32935 1726853721.95612: variable 'ansible_shell_type' from source: unknown 32935 1726853721.95618: variable 'ansible_shell_executable' from source: unknown 32935 1726853721.95630: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853721.95736: variable 'ansible_pipelining' from source: unknown 32935 1726853721.95739: variable 'ansible_timeout' from source: unknown 32935 1726853721.95741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853721.95978: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853721.95995: variable 'omit' from source: magic vars 32935 1726853721.96009: starting attempt loop 32935 1726853721.96120: running the handler 32935 1726853721.96123: _low_level_execute_command(): starting 32935 1726853721.96125: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853721.97552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853721.97695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853721.97765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853721.97838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853721.97880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853721.97943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853721.99873: stdout chunk (state=3): >>>/root <<< 32935 1726853722.00053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.00056: stdout chunk (state=3): >>><<< 32935 1726853722.00061: stderr chunk (state=3): >>><<< 32935 1726853722.00064: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.00067: _low_level_execute_command(): starting 32935 1726853722.00070: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950 `" && echo ansible-tmp-1726853721.999614-33339-21702543259950="` echo /root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950 `" ) && sleep 0' 32935 1726853722.01524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.01528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.01531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.01534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.01645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853722.01977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.02015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.02054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.03983: stdout chunk (state=3): >>>ansible-tmp-1726853721.999614-33339-21702543259950=/root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950 <<< 32935 1726853722.04127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.04138: stdout chunk (state=3): >>><<< 32935 1726853722.04150: stderr chunk (state=3): >>><<< 32935 1726853722.04175: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853721.999614-33339-21702543259950=/root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.04214: variable 'ansible_module_compression' from source: unknown 32935 1726853722.04476: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853722.04479: variable 'ansible_facts' from source: unknown 32935 1726853722.04550: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/AnsiballZ_command.py 32935 1726853722.05207: Sending initial data 32935 1726853722.05209: Sent initial data (154 bytes) 32935 1726853722.06107: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.06120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.06135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.06470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.06517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.06802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.08421: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853722.08455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853722.08493: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpzl0ifn_5 /root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/AnsiballZ_command.py <<< 32935 1726853722.08498: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/AnsiballZ_command.py" <<< 32935 1726853722.08553: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpzl0ifn_5" to remote "/root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/AnsiballZ_command.py" <<< 32935 1726853722.10049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.10053: stdout chunk (state=3): >>><<< 32935 1726853722.10056: stderr chunk (state=3): >>><<< 32935 1726853722.10061: done transferring module to remote 32935 1726853722.10063: _low_level_execute_command(): starting 32935 1726853722.10065: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/ /root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/AnsiballZ_command.py && sleep 0' 32935 1726853722.11165: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.11187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.11205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.11289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.11419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853722.11484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.11500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.11563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.13389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.13400: stdout chunk (state=3): >>><<< 32935 1726853722.13419: stderr chunk (state=3): >>><<< 32935 1726853722.13450: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.13460: _low_level_execute_command(): starting 32935 1726853722.13473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/AnsiballZ_command.py && sleep 0' 32935 1726853722.14047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.14060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.14087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.14109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853722.14204: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.14226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.14306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.30615: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-20 13:35:22.293942", "end": "2024-09-20 13:35:22.299274", "delta": "0:00:00.005332", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853722.32777: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853722.32782: stdout chunk (state=3): >>><<< 32935 1726853722.32784: stderr chunk (state=3): >>><<< 32935 1726853722.32803: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101"], "start": "2024-09-20 13:35:22.293942", "end": "2024-09-20 13:35:22.299274", "delta": "0:00:00.005332", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr101 type veth peer name peerlsr101", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
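Each looped command goes through the full non-pipelined execution path shown above: probe the remote home directory (echo ~), mkdir an ansible-tmp directory, sftp the AnsiballZ_command.py wrapper, chmod u+x it, run it with /usr/bin/python3.12, and finally rm -f -r the temp directory. That path is taken because the trace sets ansible_pipelining to False. As a sketch, assuming inventory or group_vars YAML, this is the connection variable that would switch to the pipelined path, which feeds the module to the remote interpreter over the existing SSH session and skips the temp-dir/sftp/chmod/cleanup round trips for most modules:

# e.g. group_vars/all.yml (location assumed)
ansible_pipelining: true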
32935 1726853722.32897: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr101 type veth peer name peerlsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853722.33046: _low_level_execute_command(): starting 32935 1726853722.33049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853721.999614-33339-21702543259950/ > /dev/null 2>&1 && sleep 0' 32935 1726853722.34135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.34138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853722.34141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.34143: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.34150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.34493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.38378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.38392: stdout chunk (state=3): >>><<< 32935 1726853722.38406: stderr chunk (state=3): >>><<< 32935 1726853722.38783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.38787: handler run complete 32935 1726853722.38789: Evaluated conditional (False): False 32935 1726853722.38792: attempt loop complete, returning result 32935 1726853722.38794: variable 'item' from source: unknown 32935 1726853722.38795: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr101", "type", "veth", "peer", "name", "peerlsr101" ], "delta": "0:00:00.005332", "end": "2024-09-20 13:35:22.299274", "item": "ip link add lsr101 type veth peer name peerlsr101", "rc": 0, "start": "2024-09-20 13:35:22.293942" } 32935 1726853722.39363: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853722.39417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853722.39515: variable 'omit' from source: magic vars 32935 1726853722.39914: variable 'ansible_distribution_major_version' from source: facts 32935 1726853722.39960: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853722.40341: variable 'type' from source: play vars 32935 1726853722.40390: variable 'state' from source: include params 32935 1726853722.40400: variable 'interface' from source: play vars 32935 1726853722.40477: variable 'current_interfaces' from source: set_fact 32935 1726853722.40480: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 32935 1726853722.40482: variable 'omit' from source: magic vars 32935 1726853722.40504: variable 'omit' from source: magic vars 32935 1726853722.40647: variable 'item' from source: unknown 32935 1726853722.40761: variable 'item' from source: unknown 32935 1726853722.40864: variable 'omit' from source: magic vars 32935 1726853722.40919: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853722.40944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853722.41042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853722.41046: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853722.41048: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853722.41050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853722.41152: Set connection var ansible_timeout to 10 32935 1726853722.41155: Set connection var ansible_shell_type to sh 32935 1726853722.41157: Set connection var ansible_pipelining to False 32935 1726853722.41188: Set connection var ansible_connection to ssh 32935 1726853722.41201: Set connection var ansible_shell_executable to /bin/sh 32935 1726853722.41313: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853722.41320: variable 
'ansible_shell_executable' from source: unknown 32935 1726853722.41328: variable 'ansible_connection' from source: unknown 32935 1726853722.41335: variable 'ansible_module_compression' from source: unknown 32935 1726853722.41341: variable 'ansible_shell_type' from source: unknown 32935 1726853722.41408: variable 'ansible_shell_executable' from source: unknown 32935 1726853722.41411: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853722.41513: variable 'ansible_pipelining' from source: unknown 32935 1726853722.41516: variable 'ansible_timeout' from source: unknown 32935 1726853722.41518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853722.41755: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853722.41776: variable 'omit' from source: magic vars 32935 1726853722.41841: starting attempt loop 32935 1726853722.41844: running the handler 32935 1726853722.41846: _low_level_execute_command(): starting 32935 1726853722.41848: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853722.42980: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.43282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853722.43316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.43639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.45294: stdout chunk (state=3): >>>/root <<< 32935 1726853722.45419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.45427: stdout chunk (state=3): >>><<< 32935 1726853722.45430: stderr chunk (state=3): >>><<< 32935 1726853722.45447: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.45457: _low_level_execute_command(): starting 32935 1726853722.45465: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550 `" && echo ansible-tmp-1726853722.4544704-33339-250662819855550="` echo /root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550 `" ) && sleep 0' 32935 1726853722.46777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.46782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.46785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.46787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853722.46789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853722.46791: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853722.46793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.46795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853722.46796: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853722.46798: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32935 1726853722.47142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.47250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.49072: stdout chunk (state=3): >>>ansible-tmp-1726853722.4544704-33339-250662819855550=/root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550 <<< 32935 1726853722.49181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.49226: stderr chunk (state=3): >>><<< 32935 1726853722.49234: stdout chunk (state=3): >>><<< 32935 1726853722.49256: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853722.4544704-33339-250662819855550=/root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.49285: variable 'ansible_module_compression' from source: unknown 32935 1726853722.49323: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853722.49343: variable 'ansible_facts' from source: unknown 32935 1726853722.49454: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/AnsiballZ_command.py 32935 1726853722.49938: Sending initial data 32935 1726853722.49942: Sent initial data (156 bytes) 32935 1726853722.50978: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.50982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.50984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.50986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853722.50988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853722.50990: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853722.50992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.51157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.51277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.51280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.52772: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 32935 1726853722.52785: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 
debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853722.52875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853722.53195: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp84jfyg92 /root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/AnsiballZ_command.py <<< 32935 1726853722.53201: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/AnsiballZ_command.py" <<< 32935 1726853722.53239: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp84jfyg92" to remote "/root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/AnsiballZ_command.py" <<< 32935 1726853722.54833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.54989: stderr chunk (state=3): >>><<< 32935 1726853722.54992: stdout chunk (state=3): >>><<< 32935 1726853722.54995: done transferring module to remote 32935 1726853722.54997: _low_level_execute_command(): starting 32935 1726853722.55000: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/ /root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/AnsiballZ_command.py && sleep 0' 32935 1726853722.56829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.56837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.56848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.56865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853722.56902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853722.56909: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853722.56923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.56941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853722.57085: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.57225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.57343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.59246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.59256: stdout chunk (state=3): >>><<< 32935 1726853722.59273: stderr chunk (state=3): >>><<< 32935 1726853722.59298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.59467: _low_level_execute_command(): starting 32935 1726853722.59472: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/AnsiballZ_command.py && sleep 0' 32935 1726853722.60840: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.60845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853722.60848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853722.60851: stderr chunk (state=3): >>>debug2: match found <<< 32935 1726853722.60854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.60912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853722.60968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.60988: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.61128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.76708: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-20 13:35:22.762072", "end": "2024-09-20 13:35:22.766084", "delta": "0:00:00.004012", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853722.78682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853722.78687: stdout chunk (state=3): >>><<< 32935 1726853722.78689: stderr chunk (state=3): >>><<< 32935 1726853722.78692: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr101", "up"], "start": "2024-09-20 13:35:22.762072", "end": "2024-09-20 13:35:22.766084", "delta": "0:00:00.004012", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
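Both ip commands return "changed": true in their raw module JSON, yet the recorded per-item results report "changed": false, and an "Evaluated conditional (False): False" line follows each "handler run complete". That pattern is consistent with a changed_when override on the task; assuming that is what is in use, the idiom is simply:

- name: Illustrative command task with a changed_when override (hypothetical example)
  command: ip link set peerlsr101 up
  changed_when: false   # record the result as ok regardless of the module's own changed flag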
32935 1726853722.78699: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853722.78701: _low_level_execute_command(): starting 32935 1726853722.78707: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853722.4544704-33339-250662819855550/ > /dev/null 2>&1 && sleep 0' 32935 1726853722.79800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.80123: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.80129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.80261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853722.80514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.80549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.82425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.82453: stdout chunk (state=3): >>><<< 32935 1726853722.82464: stderr chunk (state=3): >>><<< 32935 1726853722.82543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.82549: handler run complete 32935 1726853722.82594: Evaluated conditional (False): False 32935 1726853722.82612: attempt loop complete, returning result 32935 1726853722.82762: variable 'item' from source: unknown 32935 1726853722.82929: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr101", "up" ], "delta": "0:00:00.004012", "end": "2024-09-20 13:35:22.766084", "item": "ip link set peerlsr101 up", "rc": 0, "start": "2024-09-20 13:35:22.762072" } 32935 1726853722.83344: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853722.83348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853722.83778: variable 'omit' from source: magic vars 32935 1726853722.84051: variable 'ansible_distribution_major_version' from source: facts 32935 1726853722.84055: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853722.84515: variable 'type' from source: play vars 32935 1726853722.84523: variable 'state' from source: include params 32935 1726853722.84529: variable 'interface' from source: play vars 32935 1726853722.84535: variable 'current_interfaces' from source: set_fact 32935 1726853722.84613: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 32935 1726853722.84648: variable 'omit' from source: magic vars 32935 1726853722.84813: variable 'omit' from source: magic vars 32935 1726853722.84855: variable 'item' from source: unknown 32935 1726853722.85225: variable 'item' from source: unknown 32935 1726853722.85263: variable 'omit' from source: magic vars 32935 1726853722.85384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853722.85588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853722.85592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853722.85595: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853722.85598: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853722.85600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853722.86091: Set connection var ansible_timeout to 10 32935 1726853722.86095: Set connection var ansible_shell_type to sh 32935 1726853722.86097: Set connection var ansible_pipelining to False 32935 1726853722.86101: Set connection var ansible_connection to ssh 32935 1726853722.86103: Set connection var ansible_shell_executable to /bin/sh 32935 1726853722.86105: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853722.86107: variable 'ansible_shell_executable' from source: unknown 32935 1726853722.86109: variable 'ansible_connection' from source: 
unknown 32935 1726853722.86111: variable 'ansible_module_compression' from source: unknown 32935 1726853722.86113: variable 'ansible_shell_type' from source: unknown 32935 1726853722.86115: variable 'ansible_shell_executable' from source: unknown 32935 1726853722.86123: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853722.86125: variable 'ansible_pipelining' from source: unknown 32935 1726853722.86127: variable 'ansible_timeout' from source: unknown 32935 1726853722.86186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853722.86650: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853722.86786: variable 'omit' from source: magic vars 32935 1726853722.86793: starting attempt loop 32935 1726853722.86796: running the handler 32935 1726853722.86800: _low_level_execute_command(): starting 32935 1726853722.86806: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853722.88315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.88318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853722.88320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853722.88322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.88324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.88545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.88704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.90162: stdout chunk (state=3): >>>/root <<< 32935 1726853722.90252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.90312: stderr chunk (state=3): >>><<< 32935 1726853722.90330: stdout chunk (state=3): >>><<< 32935 1726853722.90354: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.90369: _low_level_execute_command(): starting 32935 1726853722.90381: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531 `" && echo ansible-tmp-1726853722.9036114-33339-144659997285531="` echo /root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531 `" ) && sleep 0' 32935 1726853722.91601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.91622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.91669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853722.91767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853722.91790: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.91886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.91989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.92049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.93898: stdout chunk (state=3): >>>ansible-tmp-1726853722.9036114-33339-144659997285531=/root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531 <<< 32935 1726853722.94039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.94083: stdout chunk (state=3): >>><<< 32935 1726853722.94086: stderr chunk (state=3): >>><<< 32935 1726853722.94119: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853722.9036114-33339-144659997285531=/root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853722.94276: variable 'ansible_module_compression' from source: unknown 32935 1726853722.94279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853722.94281: variable 'ansible_facts' from source: unknown 32935 1726853722.94300: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/AnsiballZ_command.py 32935 1726853722.94417: Sending initial data 32935 1726853722.94521: Sent initial data (156 bytes) 32935 1726853722.95054: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.95084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.95093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853722.95282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853722.95286: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853722.95292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.95294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.95521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853722.96797: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 
debug2: Server supports extension "limits@openssh.com" revision 1 <<< 32935 1726853722.96801: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853722.96879: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853722.96916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpv4dq6neg /root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/AnsiballZ_command.py <<< 32935 1726853722.96930: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/AnsiballZ_command.py" <<< 32935 1726853722.96992: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 32935 1726853722.97009: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpv4dq6neg" to remote "/root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/AnsiballZ_command.py" <<< 32935 1726853722.97022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/AnsiballZ_command.py" <<< 32935 1726853722.97981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853722.97984: stdout chunk (state=3): >>><<< 32935 1726853722.97988: stderr chunk (state=3): >>><<< 32935 1726853722.97990: done transferring module to remote 32935 1726853722.98028: _low_level_execute_command(): starting 32935 1726853722.98070: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/ /root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/AnsiballZ_command.py && sleep 0' 32935 1726853722.99223: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853722.99263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853722.99378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853722.99399: stderr chunk (state=3): >>>debug2: match found <<< 32935 1726853722.99461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853722.99495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853722.99510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853722.99530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853722.99609: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853723.01495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853723.01498: stdout chunk (state=3): >>><<< 32935 1726853723.01539: stderr chunk (state=3): >>><<< 32935 1726853723.01601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853723.01629: _low_level_execute_command(): starting 32935 1726853723.01634: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/AnsiballZ_command.py && sleep 0' 32935 1726853723.02304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853723.02317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853723.02330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853723.02345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853723.02367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853723.02460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853723.02495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853723.02612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853723.18307: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-20 
13:35:23.176747", "end": "2024-09-20 13:35:23.180545", "delta": "0:00:00.003798", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853723.19828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853723.19918: stderr chunk (state=3): >>><<< 32935 1726853723.19922: stdout chunk (state=3): >>><<< 32935 1726853723.19983: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr101", "up"], "start": "2024-09-20 13:35:23.176747", "end": "2024-09-20 13:35:23.180545", "delta": "0:00:00.003798", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr101 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
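The two module results above (ip link set peerlsr101 up and ip link set lsr101 up) are loop items of TASK: Create veth interface lsr101, gated on the conditional logged earlier (type == 'veth' and state == 'present' and interface not in current_interfaces). A minimal sketch of such a looped command task follows, reconstructed from the logged items and conditional; the actual task in the fedora.linux_system_roles collection may differ, and the "ip link add" item is an assumption about how the pair was created earlier in the run, outside this excerpt.

# Sketch only; reconstructed from the logged loop items and conditional,
# not copied from the collection. The "ip link add" line is an assumption.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  loop:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces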
32935 1726853723.19987: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr101 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853723.20051: _low_level_execute_command(): starting 32935 1726853723.20091: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853722.9036114-33339-144659997285531/ > /dev/null 2>&1 && sleep 0' 32935 1726853723.21520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853723.21750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853723.21753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853723.21756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853723.21875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853723.23724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853723.23794: stdout chunk (state=3): >>><<< 32935 1726853723.23798: stderr chunk (state=3): >>><<< 32935 1726853723.23958: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853723.23962: handler run complete 32935 1726853723.23965: Evaluated conditional (False): False 32935 1726853723.23967: attempt loop complete, returning result 32935 1726853723.23969: variable 'item' from source: unknown 32935 1726853723.24278: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr101", "up" ], "delta": "0:00:00.003798", "end": "2024-09-20 13:35:23.180545", "item": "ip link set lsr101 up", "rc": 0, "start": "2024-09-20 13:35:23.176747" } 32935 1726853723.24378: dumping result to json 32935 1726853723.24776: done dumping result, returning 32935 1726853723.24779: done running TaskExecutor() for managed_node1/TASK: Create veth interface lsr101 [02083763-bbaf-84df-441d-00000000021f] 32935 1726853723.24781: sending task result for task 02083763-bbaf-84df-441d-00000000021f 32935 1726853723.24950: no more pending results, returning what we have 32935 1726853723.24954: results queue empty 32935 1726853723.24955: checking for any_errors_fatal 32935 1726853723.24959: done checking for any_errors_fatal 32935 1726853723.24960: checking for max_fail_percentage 32935 1726853723.24962: done checking for max_fail_percentage 32935 1726853723.24962: checking to see if all hosts have failed and the running result is not ok 32935 1726853723.24969: done checking to see if all hosts have failed 32935 1726853723.24970: getting the remaining hosts for this loop 32935 1726853723.24973: done getting the remaining hosts for this loop 32935 1726853723.24977: getting the next task for host managed_node1 32935 1726853723.24985: done getting next task for host managed_node1 32935 1726853723.24987: ^ task is: TASK: Set up veth as managed by NetworkManager 32935 1726853723.24990: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853723.24994: getting variables 32935 1726853723.24996: in VariableManager get_vars() 32935 1726853723.25294: Calling all_inventory to load vars for managed_node1 32935 1726853723.25297: Calling groups_inventory to load vars for managed_node1 32935 1726853723.25301: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853723.25307: done sending task result for task 02083763-bbaf-84df-441d-00000000021f 32935 1726853723.25310: WORKER PROCESS EXITING 32935 1726853723.25318: Calling all_plugins_play to load vars for managed_node1 32935 1726853723.25321: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853723.25324: Calling groups_plugins_play to load vars for managed_node1 32935 1726853723.26068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853723.26693: done with get_vars() 32935 1726853723.26705: done getting variables 32935 1726853723.26764: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:35:23 -0400 (0:00:01.369) 0:00:08.406 ****** 32935 1726853723.27122: entering _queue_task() for managed_node1/command 32935 1726853723.27724: worker is 1 (out of 1 available) 32935 1726853723.27738: exiting _queue_task() for managed_node1/command 32935 1726853723.27752: done queuing things up, now waiting for results queue to drain 32935 1726853723.27868: waiting for pending results... 
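In the exchange that follows (as in every task in this log), the connection vars show ansible_pipelining set to False, which is why each command-module run goes through the full remote cycle seen above: echo ~, mkdir of a per-task tmp directory, an SFTP put of AnsiballZ_command.py, chmod u+x, execution with python3.12, and a final rm -f -r cleanup. A minimal sketch of an inventory setting that would enable pipelining and skip the file transfer is below; this is an assumption shown for contrast, not part of this run, and when become is used it requires requiretty to be disabled in sudoers on the target.

# Hypothetical group_vars/all.yml snippet (assumption; this run has
# ansible_pipelining set to False, so the SFTP transfer above happens).
ansible_pipelining: true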
32935 1726853723.28166: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 32935 1726853723.28374: in run() - task 02083763-bbaf-84df-441d-000000000220 32935 1726853723.28490: variable 'ansible_search_path' from source: unknown 32935 1726853723.28494: variable 'ansible_search_path' from source: unknown 32935 1726853723.28533: calling self._execute() 32935 1726853723.28736: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853723.28742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853723.28751: variable 'omit' from source: magic vars 32935 1726853723.29554: variable 'ansible_distribution_major_version' from source: facts 32935 1726853723.29608: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853723.29949: variable 'type' from source: play vars 32935 1726853723.29953: variable 'state' from source: include params 32935 1726853723.29986: Evaluated conditional (type == 'veth' and state == 'present'): True 32935 1726853723.29989: variable 'omit' from source: magic vars 32935 1726853723.30073: variable 'omit' from source: magic vars 32935 1726853723.30287: variable 'interface' from source: play vars 32935 1726853723.30304: variable 'omit' from source: magic vars 32935 1726853723.30363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853723.30498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853723.30518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853723.30536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853723.30548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853723.30694: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853723.30698: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853723.30776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853723.30911: Set connection var ansible_timeout to 10 32935 1726853723.30918: Set connection var ansible_shell_type to sh 32935 1726853723.30926: Set connection var ansible_pipelining to False 32935 1726853723.30929: Set connection var ansible_connection to ssh 32935 1726853723.30934: Set connection var ansible_shell_executable to /bin/sh 32935 1726853723.30939: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853723.30968: variable 'ansible_shell_executable' from source: unknown 32935 1726853723.30973: variable 'ansible_connection' from source: unknown 32935 1726853723.30975: variable 'ansible_module_compression' from source: unknown 32935 1726853723.30978: variable 'ansible_shell_type' from source: unknown 32935 1726853723.30980: variable 'ansible_shell_executable' from source: unknown 32935 1726853723.30982: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853723.30987: variable 'ansible_pipelining' from source: unknown 32935 1726853723.30989: variable 'ansible_timeout' from source: unknown 32935 1726853723.30993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853723.31305: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853723.31338: variable 'omit' from source: magic vars 32935 1726853723.31341: starting attempt loop 32935 1726853723.31343: running the handler 32935 1726853723.31452: _low_level_execute_command(): starting 32935 1726853723.31464: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853723.32897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853723.32935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853723.32939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853723.32994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853723.33084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853723.34744: stdout chunk (state=3): >>>/root <<< 32935 1726853723.34846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853723.34924: stderr chunk (state=3): >>><<< 32935 1726853723.34932: stdout chunk (state=3): >>><<< 32935 1726853723.34954: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853723.35139: 
_low_level_execute_command(): starting 32935 1726853723.35147: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623 `" && echo ansible-tmp-1726853723.3504448-33399-197910742199623="` echo /root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623 `" ) && sleep 0' 32935 1726853723.36103: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853723.36115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853723.36130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853723.36253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853723.38477: stdout chunk (state=3): >>>ansible-tmp-1726853723.3504448-33399-197910742199623=/root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623 <<< 32935 1726853723.38481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853723.38484: stdout chunk (state=3): >>><<< 32935 1726853723.38486: stderr chunk (state=3): >>><<< 32935 1726853723.38489: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853723.3504448-33399-197910742199623=/root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853723.38491: 
variable 'ansible_module_compression' from source: unknown 32935 1726853723.38493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853723.38495: variable 'ansible_facts' from source: unknown 32935 1726853723.38552: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/AnsiballZ_command.py 32935 1726853723.38921: Sending initial data 32935 1726853723.38930: Sent initial data (156 bytes) 32935 1726853723.39429: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853723.39452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853723.39584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853723.39625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853723.39691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853723.41208: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 32935 1726853723.41230: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853723.41290: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853723.41324: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmps4hkib91 /root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/AnsiballZ_command.py <<< 32935 1726853723.41346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/AnsiballZ_command.py" <<< 32935 1726853723.41376: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmps4hkib91" to remote "/root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/AnsiballZ_command.py" <<< 32935 1726853723.42250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853723.42254: stdout chunk (state=3): >>><<< 32935 1726853723.42256: stderr chunk (state=3): >>><<< 32935 1726853723.42265: done transferring module to remote 32935 1726853723.42283: _low_level_execute_command(): starting 32935 1726853723.42292: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/ /root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/AnsiballZ_command.py && sleep 0' 32935 1726853723.43406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853723.43488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853723.43570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853723.45499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853723.45513: stderr chunk (state=3): >>><<< 32935 1726853723.45520: stdout chunk (state=3): >>><<< 32935 1726853723.45543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853723.45555: _low_level_execute_command(): starting 32935 1726853723.45565: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/AnsiballZ_command.py && sleep 0' 32935 1726853723.46704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853723.46760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853723.47004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853723.47007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853723.47067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853723.64300: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-20 13:35:23.622355", "end": "2024-09-20 13:35:23.641485", "delta": "0:00:00.019130", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853723.66454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853723.66459: stdout chunk (state=3): >>><<< 32935 1726853723.66461: stderr chunk (state=3): >>><<< 32935 1726853723.66464: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"], "start": "2024-09-20 13:35:23.622355", "end": "2024-09-20 13:35:23.641485", "delta": "0:00:00.019130", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr101 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
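The result above shows the command module running nmcli d set lsr101 managed true for TASK: Set up veth as managed by NetworkManager (manage_test_interface.yml:35), gated earlier on type == 'veth' and state == 'present'. A minimal sketch of such a task, reconstructed from the logged command and conditional, follows; the real task in the collection may differ in detail.

# Sketch reconstructed from the logged command and conditional; not copied
# from the collection.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
  when: type == 'veth' and state == 'present'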
32935 1726853723.66467: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr101 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853723.66469: _low_level_execute_command(): starting 32935 1726853723.66474: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853723.3504448-33399-197910742199623/ > /dev/null 2>&1 && sleep 0' 32935 1726853723.67891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853723.68114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853723.68145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853723.70037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853723.70048: stdout chunk (state=3): >>><<< 32935 1726853723.70062: stderr chunk (state=3): >>><<< 32935 1726853723.70090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853723.70277: handler run complete 32935 1726853723.70281: Evaluated conditional (False): False 32935 1726853723.70283: attempt loop complete, returning result 32935 1726853723.70285: _execute() done 32935 1726853723.70287: dumping result to json 32935 1726853723.70289: done dumping result, returning 32935 1726853723.70291: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-84df-441d-000000000220] 32935 1726853723.70293: sending task result for task 02083763-bbaf-84df-441d-000000000220 32935 1726853723.70648: done sending task result for task 02083763-bbaf-84df-441d-000000000220 32935 1726853723.70651: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr101", "managed", "true" ], "delta": "0:00:00.019130", "end": "2024-09-20 13:35:23.641485", "rc": 0, "start": "2024-09-20 13:35:23.622355" } 32935 1726853723.70733: no more pending results, returning what we have 32935 1726853723.70736: results queue empty 32935 1726853723.70737: checking for any_errors_fatal 32935 1726853723.70748: done checking for any_errors_fatal 32935 1726853723.70749: checking for max_fail_percentage 32935 1726853723.70750: done checking for max_fail_percentage 32935 1726853723.70751: checking to see if all hosts have failed and the running result is not ok 32935 1726853723.70752: done checking to see if all hosts have failed 32935 1726853723.70753: getting the remaining hosts for this loop 32935 1726853723.70754: done getting the remaining hosts for this loop 32935 1726853723.70757: getting the next task for host managed_node1 32935 1726853723.70764: done getting next task for host managed_node1 32935 1726853723.70766: ^ task is: TASK: Delete veth interface {{ interface }} 32935 1726853723.70768: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853723.70774: getting variables 32935 1726853723.70775: in VariableManager get_vars() 32935 1726853723.70812: Calling all_inventory to load vars for managed_node1 32935 1726853723.70814: Calling groups_inventory to load vars for managed_node1 32935 1726853723.70817: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853723.70825: Calling all_plugins_play to load vars for managed_node1 32935 1726853723.70827: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853723.70830: Calling groups_plugins_play to load vars for managed_node1 32935 1726853723.71545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853723.72140: done with get_vars() 32935 1726853723.72152: done getting variables 32935 1726853723.72326: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853723.72637: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 13:35:23 -0400 (0:00:00.455) 0:00:08.862 ****** 32935 1726853723.72668: entering _queue_task() for managed_node1/command 32935 1726853723.73586: worker is 1 (out of 1 available) 32935 1726853723.73601: exiting _queue_task() for managed_node1/command 32935 1726853723.73612: done queuing things up, now waiting for results queue to drain 32935 1726853723.73613: waiting for pending results... 
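Note that the module JSON earlier reported "changed": true, while the final task result above prints "changed": false; the intervening "Evaluated conditional (False): False" line is consistent with a changed_when override on the task (most plausibly changed_when: false). A hypothetical sketch of that post-processing step, not Ansible's actual code path:

    # Hypothetical sketch of a changed_when override; not Ansible's real implementation.
    def apply_changed_when(module_result: dict, changed_when_value: bool) -> dict:
        result = dict(module_result)
        result["changed"] = changed_when_value  # override whatever the module reported
        return result

    raw = {"changed": True, "rc": 0, "cmd": ["nmcli", "d", "set", "lsr101", "managed", "true"]}
    print(apply_changed_when(raw, changed_when_value=False)["changed"])  # False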
32935 1726853723.74150: running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr101 32935 1726853723.74247: in run() - task 02083763-bbaf-84df-441d-000000000221 32935 1726853723.74381: variable 'ansible_search_path' from source: unknown 32935 1726853723.74462: variable 'ansible_search_path' from source: unknown 32935 1726853723.74466: calling self._execute() 32935 1726853723.74679: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853723.74684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853723.74686: variable 'omit' from source: magic vars 32935 1726853723.75367: variable 'ansible_distribution_major_version' from source: facts 32935 1726853723.75458: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853723.75893: variable 'type' from source: play vars 32935 1726853723.75903: variable 'state' from source: include params 32935 1726853723.75912: variable 'interface' from source: play vars 32935 1726853723.75921: variable 'current_interfaces' from source: set_fact 32935 1726853723.75934: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 32935 1726853723.75941: when evaluation is False, skipping this task 32935 1726853723.75948: _execute() done 32935 1726853723.75955: dumping result to json 32935 1726853723.75961: done dumping result, returning 32935 1726853723.75973: done running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr101 [02083763-bbaf-84df-441d-000000000221] 32935 1726853723.76212: sending task result for task 02083763-bbaf-84df-441d-000000000221 32935 1726853723.76290: done sending task result for task 02083763-bbaf-84df-441d-000000000221 32935 1726853723.76294: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32935 1726853723.76364: no more pending results, returning what we have 32935 1726853723.76368: results queue empty 32935 1726853723.76369: checking for any_errors_fatal 32935 1726853723.76382: done checking for any_errors_fatal 32935 1726853723.76383: checking for max_fail_percentage 32935 1726853723.76385: done checking for max_fail_percentage 32935 1726853723.76386: checking to see if all hosts have failed and the running result is not ok 32935 1726853723.76387: done checking to see if all hosts have failed 32935 1726853723.76388: getting the remaining hosts for this loop 32935 1726853723.76390: done getting the remaining hosts for this loop 32935 1726853723.76393: getting the next task for host managed_node1 32935 1726853723.76401: done getting next task for host managed_node1 32935 1726853723.76403: ^ task is: TASK: Create dummy interface {{ interface }} 32935 1726853723.76407: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853723.76412: getting variables 32935 1726853723.76414: in VariableManager get_vars() 32935 1726853723.76460: Calling all_inventory to load vars for managed_node1 32935 1726853723.76463: Calling groups_inventory to load vars for managed_node1 32935 1726853723.76465: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853723.76781: Calling all_plugins_play to load vars for managed_node1 32935 1726853723.76785: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853723.76788: Calling groups_plugins_play to load vars for managed_node1 32935 1726853723.76955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853723.77535: done with get_vars() 32935 1726853723.77545: done getting variables 32935 1726853723.77604: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853723.77887: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:35:23 -0400 (0:00:00.052) 0:00:08.914 ****** 32935 1726853723.77916: entering _queue_task() for managed_node1/command 32935 1726853723.78663: worker is 1 (out of 1 available) 32935 1726853723.78679: exiting _queue_task() for managed_node1/command 32935 1726853723.78693: done queuing things up, now waiting for results queue to drain 32935 1726853723.78773: waiting for pending results... 
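The "Delete veth interface lsr101" skip above and the dummy/tap tasks that follow are all gated by the same pattern of when expressions over type, state and current_interfaces, which is why each is evaluated to False here. A plain-Python rendering of those gates, using the condition strings quoted in the log; the variable values are inferred from this run (type 'veth', state 'present', lsr101 already present) and the interface list is hypothetical:

    # Values inferred from this log (type/state come from play vars and include
    # params, current_interfaces from an earlier set_fact); illustration only.
    type_ = "veth"
    state = "present"
    interface = "lsr101"
    current_interfaces = ["eth0", "lo", "lsr101"]  # hypothetical list containing lsr101

    conditions = {
        "Delete veth":  type_ == "veth"  and state == "absent"  and interface in current_interfaces,
        "Create dummy": type_ == "dummy" and state == "present" and interface not in current_interfaces,
        "Delete dummy": type_ == "dummy" and state == "absent"  and interface in current_interfaces,
        "Create tap":   type_ == "tap"   and state == "present" and interface not in current_interfaces,
        "Delete tap":   type_ == "tap"   and state == "absent"  and interface in current_interfaces,
    }
    print(conditions)  # every value is False, matching the skips logged above and below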
32935 1726853723.78886: running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr101 32935 1726853723.79277: in run() - task 02083763-bbaf-84df-441d-000000000222 32935 1726853723.79288: variable 'ansible_search_path' from source: unknown 32935 1726853723.79292: variable 'ansible_search_path' from source: unknown 32935 1726853723.79441: calling self._execute() 32935 1726853723.79572: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853723.79576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853723.79587: variable 'omit' from source: magic vars 32935 1726853723.80422: variable 'ansible_distribution_major_version' from source: facts 32935 1726853723.80519: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853723.80901: variable 'type' from source: play vars 32935 1726853723.81056: variable 'state' from source: include params 32935 1726853723.81081: variable 'interface' from source: play vars 32935 1726853723.81091: variable 'current_interfaces' from source: set_fact 32935 1726853723.81105: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 32935 1726853723.81112: when evaluation is False, skipping this task 32935 1726853723.81119: _execute() done 32935 1726853723.81126: dumping result to json 32935 1726853723.81133: done dumping result, returning 32935 1726853723.81144: done running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr101 [02083763-bbaf-84df-441d-000000000222] 32935 1726853723.81163: sending task result for task 02083763-bbaf-84df-441d-000000000222 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32935 1726853723.81394: no more pending results, returning what we have 32935 1726853723.81397: results queue empty 32935 1726853723.81398: checking for any_errors_fatal 32935 1726853723.81404: done checking for any_errors_fatal 32935 1726853723.81405: checking for max_fail_percentage 32935 1726853723.81406: done checking for max_fail_percentage 32935 1726853723.81407: checking to see if all hosts have failed and the running result is not ok 32935 1726853723.81408: done checking to see if all hosts have failed 32935 1726853723.81409: getting the remaining hosts for this loop 32935 1726853723.81410: done getting the remaining hosts for this loop 32935 1726853723.81444: getting the next task for host managed_node1 32935 1726853723.81452: done getting next task for host managed_node1 32935 1726853723.81455: ^ task is: TASK: Delete dummy interface {{ interface }} 32935 1726853723.81458: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853723.81462: getting variables 32935 1726853723.81464: in VariableManager get_vars() 32935 1726853723.81509: Calling all_inventory to load vars for managed_node1 32935 1726853723.81512: Calling groups_inventory to load vars for managed_node1 32935 1726853723.81515: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853723.81642: Calling all_plugins_play to load vars for managed_node1 32935 1726853723.81645: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853723.81651: done sending task result for task 02083763-bbaf-84df-441d-000000000222 32935 1726853723.81653: WORKER PROCESS EXITING 32935 1726853723.81661: Calling groups_plugins_play to load vars for managed_node1 32935 1726853723.81943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853723.82367: done with get_vars() 32935 1726853723.82482: done getting variables 32935 1726853723.82644: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853723.82889: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:35:23 -0400 (0:00:00.049) 0:00:08.964 ****** 32935 1726853723.82918: entering _queue_task() for managed_node1/command 32935 1726853723.83486: worker is 1 (out of 1 available) 32935 1726853723.83499: exiting _queue_task() for managed_node1/command 32935 1726853723.83511: done queuing things up, now waiting for results queue to drain 32935 1726853723.83513: waiting for pending results... 
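The task name is stored untemplated ("Delete dummy interface {{ interface }}" in the "^ task is:" line) and only rendered with the play var just before the TASK banner is printed, which is what the "variable 'interface' from source: play vars" entries mark. A minimal standalone rendering with the jinja2 package (the same templating engine Ansible uses); this snippet is an illustration, not Ansible's internal code:

    # Minimal standalone rendering of the templated task name seen in the log.
    # Requires the jinja2 package.
    from jinja2 import Template

    name_template = "Delete dummy interface {{ interface }}"
    print(Template(name_template).render(interface="lsr101"))
    # -> Delete dummy interface lsr101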
32935 1726853723.83669: running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr101 32935 1726853723.83783: in run() - task 02083763-bbaf-84df-441d-000000000223 32935 1726853723.83806: variable 'ansible_search_path' from source: unknown 32935 1726853723.83812: variable 'ansible_search_path' from source: unknown 32935 1726853723.83849: calling self._execute() 32935 1726853723.83936: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853723.83946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853723.83958: variable 'omit' from source: magic vars 32935 1726853723.84295: variable 'ansible_distribution_major_version' from source: facts 32935 1726853723.84314: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853723.84521: variable 'type' from source: play vars 32935 1726853723.84532: variable 'state' from source: include params 32935 1726853723.84541: variable 'interface' from source: play vars 32935 1726853723.84557: variable 'current_interfaces' from source: set_fact 32935 1726853723.84573: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 32935 1726853723.84665: when evaluation is False, skipping this task 32935 1726853723.84669: _execute() done 32935 1726853723.84673: dumping result to json 32935 1726853723.84676: done dumping result, returning 32935 1726853723.84679: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr101 [02083763-bbaf-84df-441d-000000000223] 32935 1726853723.84681: sending task result for task 02083763-bbaf-84df-441d-000000000223 32935 1726853723.84744: done sending task result for task 02083763-bbaf-84df-441d-000000000223 32935 1726853723.84747: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32935 1726853723.84821: no more pending results, returning what we have 32935 1726853723.84825: results queue empty 32935 1726853723.84826: checking for any_errors_fatal 32935 1726853723.84832: done checking for any_errors_fatal 32935 1726853723.84833: checking for max_fail_percentage 32935 1726853723.84835: done checking for max_fail_percentage 32935 1726853723.84835: checking to see if all hosts have failed and the running result is not ok 32935 1726853723.84837: done checking to see if all hosts have failed 32935 1726853723.84838: getting the remaining hosts for this loop 32935 1726853723.84840: done getting the remaining hosts for this loop 32935 1726853723.84843: getting the next task for host managed_node1 32935 1726853723.84851: done getting next task for host managed_node1 32935 1726853723.84854: ^ task is: TASK: Create tap interface {{ interface }} 32935 1726853723.84857: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853723.84861: getting variables 32935 1726853723.84863: in VariableManager get_vars() 32935 1726853723.84914: Calling all_inventory to load vars for managed_node1 32935 1726853723.84917: Calling groups_inventory to load vars for managed_node1 32935 1726853723.84920: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853723.84932: Calling all_plugins_play to load vars for managed_node1 32935 1726853723.84936: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853723.84939: Calling groups_plugins_play to load vars for managed_node1 32935 1726853723.85431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853723.85629: done with get_vars() 32935 1726853723.85645: done getting variables 32935 1726853723.85703: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853723.85815: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:35:23 -0400 (0:00:00.029) 0:00:08.994 ****** 32935 1726853723.85844: entering _queue_task() for managed_node1/command 32935 1726853723.86147: worker is 1 (out of 1 available) 32935 1726853723.86160: exiting _queue_task() for managed_node1/command 32935 1726853723.86294: done queuing things up, now waiting for results queue to drain 32935 1726853723.86297: waiting for pending results... 
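The current_interfaces fact referenced by these conditionals was set by a set_fact task outside this excerpt. A rough local approximation, assuming a Linux host, is to list /sys/class/net, the same directory the stat task later in this section checks; names and output are illustrative only:

    # Rough local approximation of a current_interfaces fact: list /sys/class/net.
    # The actual set_fact task is outside this excerpt; illustration only (Linux).
    import os

    current_interfaces = sorted(os.listdir("/sys/class/net"))
    print(current_interfaces)
    print("lsr101" in current_interfaces)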
32935 1726853723.86473: running TaskExecutor() for managed_node1/TASK: Create tap interface lsr101 32935 1726853723.86589: in run() - task 02083763-bbaf-84df-441d-000000000224 32935 1726853723.86614: variable 'ansible_search_path' from source: unknown 32935 1726853723.86624: variable 'ansible_search_path' from source: unknown 32935 1726853723.86667: calling self._execute() 32935 1726853723.86757: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853723.86767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853723.86786: variable 'omit' from source: magic vars 32935 1726853723.87163: variable 'ansible_distribution_major_version' from source: facts 32935 1726853723.87186: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853723.87386: variable 'type' from source: play vars 32935 1726853723.87397: variable 'state' from source: include params 32935 1726853723.87405: variable 'interface' from source: play vars 32935 1726853723.87421: variable 'current_interfaces' from source: set_fact 32935 1726853723.87814: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 32935 1726853723.87817: when evaluation is False, skipping this task 32935 1726853723.87820: _execute() done 32935 1726853723.87822: dumping result to json 32935 1726853723.87824: done dumping result, returning 32935 1726853723.87826: done running TaskExecutor() for managed_node1/TASK: Create tap interface lsr101 [02083763-bbaf-84df-441d-000000000224] 32935 1726853723.87829: sending task result for task 02083763-bbaf-84df-441d-000000000224 32935 1726853723.87905: done sending task result for task 02083763-bbaf-84df-441d-000000000224 32935 1726853723.87908: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32935 1726853723.87964: no more pending results, returning what we have 32935 1726853723.87967: results queue empty 32935 1726853723.87968: checking for any_errors_fatal 32935 1726853723.87975: done checking for any_errors_fatal 32935 1726853723.87976: checking for max_fail_percentage 32935 1726853723.87978: done checking for max_fail_percentage 32935 1726853723.87978: checking to see if all hosts have failed and the running result is not ok 32935 1726853723.87979: done checking to see if all hosts have failed 32935 1726853723.87980: getting the remaining hosts for this loop 32935 1726853723.87982: done getting the remaining hosts for this loop 32935 1726853723.87986: getting the next task for host managed_node1 32935 1726853723.87993: done getting next task for host managed_node1 32935 1726853723.87995: ^ task is: TASK: Delete tap interface {{ interface }} 32935 1726853723.87998: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853723.88002: getting variables 32935 1726853723.88003: in VariableManager get_vars() 32935 1726853723.88042: Calling all_inventory to load vars for managed_node1 32935 1726853723.88045: Calling groups_inventory to load vars for managed_node1 32935 1726853723.88047: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853723.88059: Calling all_plugins_play to load vars for managed_node1 32935 1726853723.88062: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853723.88065: Calling groups_plugins_play to load vars for managed_node1 32935 1726853723.88342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853723.88551: done with get_vars() 32935 1726853723.88563: done getting variables 32935 1726853723.88621: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853723.88735: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:35:23 -0400 (0:00:00.029) 0:00:09.023 ****** 32935 1726853723.88770: entering _queue_task() for managed_node1/command 32935 1726853723.89033: worker is 1 (out of 1 available) 32935 1726853723.89048: exiting _queue_task() for managed_node1/command 32935 1726853723.89064: done queuing things up, now waiting for results queue to drain 32935 1726853723.89066: waiting for pending results... 
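Each TASK banner in this log is followed by a wall-clock timestamp, the previous task's duration in parentheses, and the cumulative elapsed time, e.g. "(0:00:00.029) 0:00:09.023" above. When skimming long runs like this one, a small parser for that footer can help; the format below is assumed from the lines shown here:

    # Parse the "(per-task duration) cumulative" footer printed after each TASK banner.
    # Format assumed from the lines in this log, e.g. "(0:00:00.029) 0:00:09.023".
    import re
    from datetime import timedelta

    def parse_hms(text: str) -> timedelta:
        hours, minutes, seconds = text.split(":")
        return timedelta(hours=int(hours), minutes=int(minutes), seconds=float(seconds))

    footer = "Friday 20 September 2024 13:35:23 -0400 (0:00:00.029) 0:00:09.023"
    match = re.search(r"\((\d+:\d+:\d+\.\d+)\)\s+(\d+:\d+:\d+\.\d+)", footer)
    per_task, cumulative = (parse_hms(group) for group in match.groups())
    print(per_task.total_seconds(), cumulative.total_seconds())  # 0.029 9.023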
32935 1726853723.89388: running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr101 32935 1726853723.89397: in run() - task 02083763-bbaf-84df-441d-000000000225 32935 1726853723.89418: variable 'ansible_search_path' from source: unknown 32935 1726853723.89427: variable 'ansible_search_path' from source: unknown 32935 1726853723.89477: calling self._execute() 32935 1726853723.89594: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853723.89597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853723.89599: variable 'omit' from source: magic vars 32935 1726853723.89942: variable 'ansible_distribution_major_version' from source: facts 32935 1726853723.89961: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853723.90169: variable 'type' from source: play vars 32935 1726853723.90181: variable 'state' from source: include params 32935 1726853723.90246: variable 'interface' from source: play vars 32935 1726853723.90249: variable 'current_interfaces' from source: set_fact 32935 1726853723.90251: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 32935 1726853723.90253: when evaluation is False, skipping this task 32935 1726853723.90255: _execute() done 32935 1726853723.90257: dumping result to json 32935 1726853723.90259: done dumping result, returning 32935 1726853723.90261: done running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr101 [02083763-bbaf-84df-441d-000000000225] 32935 1726853723.90263: sending task result for task 02083763-bbaf-84df-441d-000000000225 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32935 1726853723.90518: no more pending results, returning what we have 32935 1726853723.90521: results queue empty 32935 1726853723.90522: checking for any_errors_fatal 32935 1726853723.90526: done checking for any_errors_fatal 32935 1726853723.90527: checking for max_fail_percentage 32935 1726853723.90528: done checking for max_fail_percentage 32935 1726853723.90529: checking to see if all hosts have failed and the running result is not ok 32935 1726853723.90530: done checking to see if all hosts have failed 32935 1726853723.90531: getting the remaining hosts for this loop 32935 1726853723.90532: done getting the remaining hosts for this loop 32935 1726853723.90535: getting the next task for host managed_node1 32935 1726853723.90544: done getting next task for host managed_node1 32935 1726853723.90546: ^ task is: TASK: Include the task 'assert_device_present.yml' 32935 1726853723.90548: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853723.90552: getting variables 32935 1726853723.90554: in VariableManager get_vars() 32935 1726853723.90594: Calling all_inventory to load vars for managed_node1 32935 1726853723.90597: Calling groups_inventory to load vars for managed_node1 32935 1726853723.90600: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853723.90611: Calling all_plugins_play to load vars for managed_node1 32935 1726853723.90614: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853723.90617: Calling groups_plugins_play to load vars for managed_node1 32935 1726853723.90977: done sending task result for task 02083763-bbaf-84df-441d-000000000225 32935 1726853723.90980: WORKER PROCESS EXITING 32935 1726853723.91001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853723.91321: done with get_vars() 32935 1726853723.91422: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:16 Friday 20 September 2024 13:35:23 -0400 (0:00:00.027) 0:00:09.051 ****** 32935 1726853723.91539: entering _queue_task() for managed_node1/include_tasks 32935 1726853723.92124: worker is 1 (out of 1 available) 32935 1726853723.92138: exiting _queue_task() for managed_node1/include_tasks 32935 1726853723.92150: done queuing things up, now waiting for results queue to drain 32935 1726853723.92152: waiting for pending results... 32935 1726853723.92461: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 32935 1726853723.92657: in run() - task 02083763-bbaf-84df-441d-00000000000d 32935 1726853723.92676: variable 'ansible_search_path' from source: unknown 32935 1726853723.92947: calling self._execute() 32935 1726853723.93022: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853723.93026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853723.93080: variable 'omit' from source: magic vars 32935 1726853723.93886: variable 'ansible_distribution_major_version' from source: facts 32935 1726853723.93898: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853723.93924: _execute() done 32935 1726853723.93927: dumping result to json 32935 1726853723.93930: done dumping result, returning 32935 1726853723.93938: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [02083763-bbaf-84df-441d-00000000000d] 32935 1726853723.93948: sending task result for task 02083763-bbaf-84df-441d-00000000000d 32935 1726853723.94085: no more pending results, returning what we have 32935 1726853723.94090: in VariableManager get_vars() 32935 1726853723.94140: Calling all_inventory to load vars for managed_node1 32935 1726853723.94143: Calling groups_inventory to load vars for managed_node1 32935 1726853723.94145: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853723.94158: Calling all_plugins_play to load vars for managed_node1 32935 1726853723.94161: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853723.94164: Calling groups_plugins_play to load vars for managed_node1 32935 1726853723.94538: done sending task result for task 02083763-bbaf-84df-441d-00000000000d 32935 1726853723.94542: WORKER PROCESS EXITING 32935 1726853723.94567: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853723.95038: done with get_vars() 32935 1726853723.95047: variable 'ansible_search_path' from source: unknown 32935 1726853723.95060: we have included files to process 32935 1726853723.95061: generating all_blocks data 32935 1726853723.95063: done generating all_blocks data 32935 1726853723.95068: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32935 1726853723.95070: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32935 1726853723.95074: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32935 1726853723.95343: in VariableManager get_vars() 32935 1726853723.95483: done with get_vars() 32935 1726853723.95722: done processing included file 32935 1726853723.95724: iterating over new_blocks loaded from include file 32935 1726853723.95725: in VariableManager get_vars() 32935 1726853723.95738: done with get_vars() 32935 1726853723.95740: filtering new block on tags 32935 1726853723.95754: done filtering new block on tags 32935 1726853723.95756: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 32935 1726853723.95760: extending task lists for all hosts with included blocks 32935 1726853724.00974: done extending task lists 32935 1726853724.00977: done processing included files 32935 1726853724.00978: results queue empty 32935 1726853724.00979: checking for any_errors_fatal 32935 1726853724.00982: done checking for any_errors_fatal 32935 1726853724.00983: checking for max_fail_percentage 32935 1726853724.00984: done checking for max_fail_percentage 32935 1726853724.00985: checking to see if all hosts have failed and the running result is not ok 32935 1726853724.00986: done checking to see if all hosts have failed 32935 1726853724.00986: getting the remaining hosts for this loop 32935 1726853724.00988: done getting the remaining hosts for this loop 32935 1726853724.00991: getting the next task for host managed_node1 32935 1726853724.00996: done getting next task for host managed_node1 32935 1726853724.00998: ^ task is: TASK: Include the task 'get_interface_stat.yml' 32935 1726853724.01001: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853724.01004: getting variables 32935 1726853724.01005: in VariableManager get_vars() 32935 1726853724.01026: Calling all_inventory to load vars for managed_node1 32935 1726853724.01030: Calling groups_inventory to load vars for managed_node1 32935 1726853724.01032: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.01039: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.01041: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.01044: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.01830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.02249: done with get_vars() 32935 1726853724.02259: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:35:24 -0400 (0:00:00.109) 0:00:09.160 ****** 32935 1726853724.02451: entering _queue_task() for managed_node1/include_tasks 32935 1726853724.03230: worker is 1 (out of 1 available) 32935 1726853724.03241: exiting _queue_task() for managed_node1/include_tasks 32935 1726853724.03251: done queuing things up, now waiting for results queue to drain 32935 1726853724.03253: waiting for pending results... 32935 1726853724.03589: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 32935 1726853724.03921: in run() - task 02083763-bbaf-84df-441d-00000000038b 32935 1726853724.03996: variable 'ansible_search_path' from source: unknown 32935 1726853724.04005: variable 'ansible_search_path' from source: unknown 32935 1726853724.04044: calling self._execute() 32935 1726853724.04279: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.04290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.04311: variable 'omit' from source: magic vars 32935 1726853724.05126: variable 'ansible_distribution_major_version' from source: facts 32935 1726853724.05144: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853724.05156: _execute() done 32935 1726853724.05165: dumping result to json 32935 1726853724.05378: done dumping result, returning 32935 1726853724.05381: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-84df-441d-00000000038b] 32935 1726853724.05384: sending task result for task 02083763-bbaf-84df-441d-00000000038b 32935 1726853724.05461: done sending task result for task 02083763-bbaf-84df-441d-00000000038b 32935 1726853724.05464: WORKER PROCESS EXITING 32935 1726853724.05510: no more pending results, returning what we have 32935 1726853724.05516: in VariableManager get_vars() 32935 1726853724.05569: Calling all_inventory to load vars for managed_node1 32935 1726853724.05574: Calling groups_inventory to load vars for managed_node1 32935 1726853724.05577: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.05592: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.05596: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.05599: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.06227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 32935 1726853724.06502: done with get_vars() 32935 1726853724.06510: variable 'ansible_search_path' from source: unknown 32935 1726853724.06511: variable 'ansible_search_path' from source: unknown 32935 1726853724.06551: we have included files to process 32935 1726853724.06552: generating all_blocks data 32935 1726853724.06554: done generating all_blocks data 32935 1726853724.06555: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32935 1726853724.06557: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32935 1726853724.06559: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32935 1726853724.06797: done processing included file 32935 1726853724.06800: iterating over new_blocks loaded from include file 32935 1726853724.06802: in VariableManager get_vars() 32935 1726853724.06820: done with get_vars() 32935 1726853724.06821: filtering new block on tags 32935 1726853724.06836: done filtering new block on tags 32935 1726853724.06838: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 32935 1726853724.06843: extending task lists for all hosts with included blocks 32935 1726853724.06943: done extending task lists 32935 1726853724.06945: done processing included files 32935 1726853724.06945: results queue empty 32935 1726853724.06946: checking for any_errors_fatal 32935 1726853724.06949: done checking for any_errors_fatal 32935 1726853724.06950: checking for max_fail_percentage 32935 1726853724.06951: done checking for max_fail_percentage 32935 1726853724.06951: checking to see if all hosts have failed and the running result is not ok 32935 1726853724.06952: done checking to see if all hosts have failed 32935 1726853724.06953: getting the remaining hosts for this loop 32935 1726853724.06954: done getting the remaining hosts for this loop 32935 1726853724.06957: getting the next task for host managed_node1 32935 1726853724.06966: done getting next task for host managed_node1 32935 1726853724.06968: ^ task is: TASK: Get stat for interface {{ interface }} 32935 1726853724.06972: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853724.06974: getting variables 32935 1726853724.06976: in VariableManager get_vars() 32935 1726853724.06989: Calling all_inventory to load vars for managed_node1 32935 1726853724.06991: Calling groups_inventory to load vars for managed_node1 32935 1726853724.06993: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.06998: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.07000: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.07003: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.07144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.07338: done with get_vars() 32935 1726853724.07346: done getting variables 32935 1726853724.07499: variable 'interface' from source: play vars TASK [Get stat for interface lsr101] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:35:24 -0400 (0:00:00.050) 0:00:09.210 ****** 32935 1726853724.07532: entering _queue_task() for managed_node1/stat 32935 1726853724.07870: worker is 1 (out of 1 available) 32935 1726853724.07883: exiting _queue_task() for managed_node1/stat 32935 1726853724.07893: done queuing things up, now waiting for results queue to drain 32935 1726853724.07894: waiting for pending results... 32935 1726853724.08176: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr101 32935 1726853724.08247: in run() - task 02083763-bbaf-84df-441d-0000000004a4 32935 1726853724.08381: variable 'ansible_search_path' from source: unknown 32935 1726853724.08385: variable 'ansible_search_path' from source: unknown 32935 1726853724.08389: calling self._execute() 32935 1726853724.08506: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.08517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.08533: variable 'omit' from source: magic vars 32935 1726853724.09474: variable 'ansible_distribution_major_version' from source: facts 32935 1726853724.09493: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853724.09505: variable 'omit' from source: magic vars 32935 1726853724.09597: variable 'omit' from source: magic vars 32935 1726853724.09862: variable 'interface' from source: play vars 32935 1726853724.09878: variable 'omit' from source: magic vars 32935 1726853724.09925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853724.10111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853724.10115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853724.10134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853724.10148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853724.10184: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853724.10227: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.10283: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 32935 1726853724.10508: Set connection var ansible_timeout to 10 32935 1726853724.10519: Set connection var ansible_shell_type to sh 32935 1726853724.10530: Set connection var ansible_pipelining to False 32935 1726853724.10536: Set connection var ansible_connection to ssh 32935 1726853724.10550: Set connection var ansible_shell_executable to /bin/sh 32935 1726853724.10558: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853724.10662: variable 'ansible_shell_executable' from source: unknown 32935 1726853724.10672: variable 'ansible_connection' from source: unknown 32935 1726853724.10681: variable 'ansible_module_compression' from source: unknown 32935 1726853724.10689: variable 'ansible_shell_type' from source: unknown 32935 1726853724.10763: variable 'ansible_shell_executable' from source: unknown 32935 1726853724.10766: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.10768: variable 'ansible_pipelining' from source: unknown 32935 1726853724.10770: variable 'ansible_timeout' from source: unknown 32935 1726853724.10773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.11121: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853724.11142: variable 'omit' from source: magic vars 32935 1726853724.11241: starting attempt loop 32935 1726853724.11244: running the handler 32935 1726853724.11247: _low_level_execute_command(): starting 32935 1726853724.11249: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853724.12908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853724.12991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853724.13003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853724.13027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853724.13106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853724.14802: stdout chunk (state=3): >>>/root <<< 32935 1726853724.14969: stdout chunk (state=3): >>><<< 32935 1726853724.14976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853724.14979: stderr chunk (state=3): >>><<< 32935 1726853724.15006: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853724.15177: _low_level_execute_command(): starting 32935 1726853724.15182: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816 `" && echo ansible-tmp-1726853724.1501343-33437-194869835664816="` echo /root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816 `" ) && sleep 0' 32935 1726853724.16310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853724.16482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853724.16794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853724.16798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853724.18718: stdout chunk (state=3): >>>ansible-tmp-1726853724.1501343-33437-194869835664816=/root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816 <<< 32935 1726853724.18817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853724.18880: stderr chunk (state=3): >>><<< 32935 1726853724.18893: stdout chunk (state=3): >>><<< 32935 1726853724.18923: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853724.1501343-33437-194869835664816=/root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853724.18988: variable 'ansible_module_compression' from source: unknown 32935 1726853724.19046: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32935 1726853724.19092: variable 'ansible_facts' from source: unknown 32935 1726853724.19265: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/AnsiballZ_stat.py 32935 1726853724.19410: Sending initial data 32935 1726853724.19418: Sent initial data (153 bytes) 32935 1726853724.19933: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853724.19947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853724.20081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853724.20097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853724.20116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853724.20128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853724.20199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853724.21773: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853724.21861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853724.21894: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmphhenvnkj /root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/AnsiballZ_stat.py <<< 32935 1726853724.21946: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/AnsiballZ_stat.py" <<< 32935 1726853724.21988: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmphhenvnkj" to remote "/root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/AnsiballZ_stat.py" <<< 32935 1726853724.22763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853724.22857: stderr chunk (state=3): >>><<< 32935 1726853724.22861: stdout chunk (state=3): >>><<< 32935 1726853724.22874: done transferring module to remote 32935 1726853724.22892: _low_level_execute_command(): starting 32935 1726853724.22906: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/ /root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/AnsiballZ_stat.py && sleep 0' 32935 1726853724.23586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853724.23676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853724.23790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853724.24054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853724.25849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853724.25852: stdout chunk (state=3): >>><<< 32935 1726853724.25855: stderr chunk (state=3): 
>>><<< 32935 1726853724.25962: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853724.25973: _low_level_execute_command(): starting 32935 1726853724.25976: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/AnsiballZ_stat.py && sleep 0' 32935 1726853724.26541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853724.26555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853724.26575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853724.26593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853724.26610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853724.26687: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853724.26809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853724.26867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853724.26960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853724.26977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853724.42135: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30552, "dev": 23, "nlink": 1, "atime": 
1726853722.2975543, "mtime": 1726853722.2975543, "ctime": 1726853722.2975543, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32935 1726853724.43577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853724.43581: stdout chunk (state=3): >>><<< 32935 1726853724.43583: stderr chunk (state=3): >>><<< 32935 1726853724.43586: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30552, "dev": 23, "nlink": 1, "atime": 1726853722.2975543, "mtime": 1726853722.2975543, "ctime": 1726853722.2975543, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
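The exchange above is the "Get stat for interface lsr101" task end to end: the AnsiballZ-wrapped stat module is uploaded over the multiplexed SSH connection with sftp, marked executable, and run with /usr/bin/python3.12, and its JSON result reports /sys/class/net/lsr101 as a world-accessible symlink into /sys/devices/virtual/net. As a rough illustration only (this is not the stat module's implementation; the interface name is simply the one from this run), the core of that check amounts to an lstat plus readlink:

```python
import os
import stat as st

def sysfs_link_stat(iface: str) -> dict:
    """Rough equivalent of the fields this task cares about; illustrative only."""
    path = f"/sys/class/net/{iface}"
    try:
        info = os.lstat(path)                  # lstat: inspect the symlink itself
    except FileNotFoundError:
        return {"exists": False, "path": path}
    is_link = st.S_ISLNK(info.st_mode)
    return {
        "exists": True,
        "path": path,
        "islnk": is_link,
        "mode": format(st.S_IMODE(info.st_mode), "04o"),
        "lnk_target": os.readlink(path) if is_link else None,
        "lnk_source": os.path.realpath(path) if is_link else None,
    }

if __name__ == "__main__":
    print(sysfs_link_stat("lsr101"))           # interface name from this run
```

Of the fields returned, the assert task later in this log only consults the exists flag; the rest are reported for completeness.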
32935 1726853724.43606: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853724.43615: _low_level_execute_command(): starting 32935 1726853724.43618: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853724.1501343-33437-194869835664816/ > /dev/null 2>&1 && sleep 0' 32935 1726853724.44083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853724.44086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853724.44089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853724.44091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853724.44101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853724.44138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853724.44152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853724.44197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853724.46025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853724.46052: stderr chunk (state=3): >>><<< 32935 1726853724.46055: stdout chunk (state=3): >>><<< 32935 1726853724.46077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853724.46083: handler run complete 32935 1726853724.46113: attempt loop complete, returning result 32935 1726853724.46116: _execute() done 32935 1726853724.46118: dumping result to json 32935 1726853724.46124: done dumping result, returning 32935 1726853724.46131: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr101 [02083763-bbaf-84df-441d-0000000004a4] 32935 1726853724.46135: sending task result for task 02083763-bbaf-84df-441d-0000000004a4 32935 1726853724.46246: done sending task result for task 02083763-bbaf-84df-441d-0000000004a4 32935 1726853724.46249: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726853722.2975543, "block_size": 4096, "blocks": 0, "ctime": 1726853722.2975543, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30552, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101", "lnk_target": "../../devices/virtual/net/lsr101", "mode": "0777", "mtime": 1726853722.2975543, "nlink": 1, "path": "/sys/class/net/lsr101", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 32935 1726853724.46347: no more pending results, returning what we have 32935 1726853724.46350: results queue empty 32935 1726853724.46352: checking for any_errors_fatal 32935 1726853724.46353: done checking for any_errors_fatal 32935 1726853724.46353: checking for max_fail_percentage 32935 1726853724.46355: done checking for max_fail_percentage 32935 1726853724.46356: checking to see if all hosts have failed and the running result is not ok 32935 1726853724.46357: done checking to see if all hosts have failed 32935 1726853724.46360: getting the remaining hosts for this loop 32935 1726853724.46362: done getting the remaining hosts for this loop 32935 1726853724.46366: getting the next task for host managed_node1 32935 1726853724.46383: done getting next task for host managed_node1 32935 1726853724.46385: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 32935 1726853724.46388: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853724.46392: getting variables 32935 1726853724.46393: in VariableManager get_vars() 32935 1726853724.46501: Calling all_inventory to load vars for managed_node1 32935 1726853724.46504: Calling groups_inventory to load vars for managed_node1 32935 1726853724.46506: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.46514: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.46516: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.46518: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.46621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.46755: done with get_vars() 32935 1726853724.46763: done getting variables 32935 1726853724.46863: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 32935 1726853724.46957: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'lsr101'] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:35:24 -0400 (0:00:00.394) 0:00:09.605 ****** 32935 1726853724.46988: entering _queue_task() for managed_node1/assert 32935 1726853724.46990: Creating lock for assert 32935 1726853724.47207: worker is 1 (out of 1 available) 32935 1726853724.47221: exiting _queue_task() for managed_node1/assert 32935 1726853724.47234: done queuing things up, now waiting for results queue to drain 32935 1726853724.47236: waiting for pending results... 
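The assert task queued here ("Assert that the interface is present - 'lsr101'", from assert_device_present.yml:5) runs entirely on the controller: as the next chunk shows, it evaluates the conditional interface_stat.stat.exists against the result registered by the stat task above and reports "All assertions passed". A minimal stand-in for that check, with the registered variable mocked up from the values shown in this log:

```python
# Stand-in for the registered result of the previous stat task; values taken
# from the JSON shown earlier in this log.
interface_stat = {"stat": {"exists": True,
                           "islnk": True,
                           "path": "/sys/class/net/lsr101"}}

def assert_present(registered: dict) -> str:
    # Same check as the conditional evaluated by the assert task:
    # interface_stat.stat.exists
    if not registered["stat"]["exists"]:
        raise AssertionError("interface 'lsr101' is not present")
    return "All assertions passed"

print(assert_present(interface_stat))
```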
32935 1726853724.47528: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr101' 32935 1726853724.47591: in run() - task 02083763-bbaf-84df-441d-00000000038c 32935 1726853724.47600: variable 'ansible_search_path' from source: unknown 32935 1726853724.47604: variable 'ansible_search_path' from source: unknown 32935 1726853724.47654: calling self._execute() 32935 1726853724.47758: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.47762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.47766: variable 'omit' from source: magic vars 32935 1726853724.48090: variable 'ansible_distribution_major_version' from source: facts 32935 1726853724.48094: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853724.48097: variable 'omit' from source: magic vars 32935 1726853724.48141: variable 'omit' from source: magic vars 32935 1726853724.48241: variable 'interface' from source: play vars 32935 1726853724.48262: variable 'omit' from source: magic vars 32935 1726853724.48303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853724.48329: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853724.48372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853724.48381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853724.48390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853724.48413: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853724.48416: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.48419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.48517: Set connection var ansible_timeout to 10 32935 1726853724.48522: Set connection var ansible_shell_type to sh 32935 1726853724.48529: Set connection var ansible_pipelining to False 32935 1726853724.48532: Set connection var ansible_connection to ssh 32935 1726853724.48536: Set connection var ansible_shell_executable to /bin/sh 32935 1726853724.48541: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853724.48572: variable 'ansible_shell_executable' from source: unknown 32935 1726853724.48591: variable 'ansible_connection' from source: unknown 32935 1726853724.48595: variable 'ansible_module_compression' from source: unknown 32935 1726853724.48616: variable 'ansible_shell_type' from source: unknown 32935 1726853724.48619: variable 'ansible_shell_executable' from source: unknown 32935 1726853724.48622: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.48624: variable 'ansible_pipelining' from source: unknown 32935 1726853724.48627: variable 'ansible_timeout' from source: unknown 32935 1726853724.48629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.48732: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 32935 1726853724.48743: variable 'omit' from source: magic vars 32935 1726853724.48746: starting attempt loop 32935 1726853724.48749: running the handler 32935 1726853724.48903: variable 'interface_stat' from source: set_fact 32935 1726853724.48906: Evaluated conditional (interface_stat.stat.exists): True 32935 1726853724.48908: handler run complete 32935 1726853724.48922: attempt loop complete, returning result 32935 1726853724.48929: _execute() done 32935 1726853724.48932: dumping result to json 32935 1726853724.48934: done dumping result, returning 32935 1726853724.48937: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr101' [02083763-bbaf-84df-441d-00000000038c] 32935 1726853724.48940: sending task result for task 02083763-bbaf-84df-441d-00000000038c 32935 1726853724.49045: done sending task result for task 02083763-bbaf-84df-441d-00000000038c 32935 1726853724.49049: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 32935 1726853724.49098: no more pending results, returning what we have 32935 1726853724.49101: results queue empty 32935 1726853724.49102: checking for any_errors_fatal 32935 1726853724.49110: done checking for any_errors_fatal 32935 1726853724.49110: checking for max_fail_percentage 32935 1726853724.49112: done checking for max_fail_percentage 32935 1726853724.49113: checking to see if all hosts have failed and the running result is not ok 32935 1726853724.49114: done checking to see if all hosts have failed 32935 1726853724.49115: getting the remaining hosts for this loop 32935 1726853724.49116: done getting the remaining hosts for this loop 32935 1726853724.49119: getting the next task for host managed_node1 32935 1726853724.49126: done getting next task for host managed_node1 32935 1726853724.49128: ^ task is: TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 32935 1726853724.49129: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853724.49133: getting variables 32935 1726853724.49134: in VariableManager get_vars() 32935 1726853724.49182: Calling all_inventory to load vars for managed_node1 32935 1726853724.49188: Calling groups_inventory to load vars for managed_node1 32935 1726853724.49190: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.49199: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.49201: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.49204: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.49403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.49590: done with get_vars() 32935 1726853724.49597: done getting variables 32935 1726853724.49635: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure the MTU for a vlan interface without autoconnect.] 
*** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:18 Friday 20 September 2024 13:35:24 -0400 (0:00:00.026) 0:00:09.632 ****** 32935 1726853724.49654: entering _queue_task() for managed_node1/debug 32935 1726853724.49852: worker is 1 (out of 1 available) 32935 1726853724.49865: exiting _queue_task() for managed_node1/debug 32935 1726853724.49878: done queuing things up, now waiting for results queue to drain 32935 1726853724.49880: waiting for pending results... 32935 1726853724.50048: running TaskExecutor() for managed_node1/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. 32935 1726853724.50117: in run() - task 02083763-bbaf-84df-441d-00000000000e 32935 1726853724.50125: variable 'ansible_search_path' from source: unknown 32935 1726853724.50153: calling self._execute() 32935 1726853724.50254: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.50257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.50266: variable 'omit' from source: magic vars 32935 1726853724.50539: variable 'ansible_distribution_major_version' from source: facts 32935 1726853724.50555: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853724.50558: variable 'omit' from source: magic vars 32935 1726853724.50573: variable 'omit' from source: magic vars 32935 1726853724.50597: variable 'omit' from source: magic vars 32935 1726853724.50627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853724.50664: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853724.50676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853724.50690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853724.50698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853724.50721: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853724.50724: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.50727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.50962: Set connection var ansible_timeout to 10 32935 1726853724.50973: Set connection var ansible_shell_type to sh 32935 1726853724.51013: Set connection var ansible_pipelining to False 32935 1726853724.51016: Set connection var ansible_connection to ssh 32935 1726853724.51019: Set connection var ansible_shell_executable to /bin/sh 32935 1726853724.51021: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853724.51026: variable 'ansible_shell_executable' from source: unknown 32935 1726853724.51029: variable 'ansible_connection' from source: unknown 32935 1726853724.51032: variable 'ansible_module_compression' from source: unknown 32935 1726853724.51035: variable 'ansible_shell_type' from source: unknown 32935 1726853724.51039: variable 'ansible_shell_executable' from source: unknown 32935 1726853724.51041: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.51046: variable 'ansible_pipelining' from source: unknown 32935 1726853724.51048: variable 'ansible_timeout' from source: 
unknown 32935 1726853724.51052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.51167: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853724.51211: variable 'omit' from source: magic vars 32935 1726853724.51214: starting attempt loop 32935 1726853724.51217: running the handler 32935 1726853724.51274: handler run complete 32935 1726853724.51278: attempt loop complete, returning result 32935 1726853724.51280: _execute() done 32935 1726853724.51283: dumping result to json 32935 1726853724.51285: done dumping result, returning 32935 1726853724.51288: done running TaskExecutor() for managed_node1/TASK: TEST: I can configure the MTU for a vlan interface without autoconnect. [02083763-bbaf-84df-441d-00000000000e] 32935 1726853724.51290: sending task result for task 02083763-bbaf-84df-441d-00000000000e 32935 1726853724.51370: done sending task result for task 02083763-bbaf-84df-441d-00000000000e 32935 1726853724.51374: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 32935 1726853724.51421: no more pending results, returning what we have 32935 1726853724.51424: results queue empty 32935 1726853724.51425: checking for any_errors_fatal 32935 1726853724.51431: done checking for any_errors_fatal 32935 1726853724.51432: checking for max_fail_percentage 32935 1726853724.51434: done checking for max_fail_percentage 32935 1726853724.51435: checking to see if all hosts have failed and the running result is not ok 32935 1726853724.51436: done checking to see if all hosts have failed 32935 1726853724.51437: getting the remaining hosts for this loop 32935 1726853724.51439: done getting the remaining hosts for this loop 32935 1726853724.51442: getting the next task for host managed_node1 32935 1726853724.51450: done getting next task for host managed_node1 32935 1726853724.51455: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32935 1726853724.51458: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853724.51474: getting variables 32935 1726853724.51476: in VariableManager get_vars() 32935 1726853724.51512: Calling all_inventory to load vars for managed_node1 32935 1726853724.51514: Calling groups_inventory to load vars for managed_node1 32935 1726853724.51516: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.51524: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.51526: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.51528: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.51674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.51907: done with get_vars() 32935 1726853724.51917: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:35:24 -0400 (0:00:00.023) 0:00:09.655 ****** 32935 1726853724.51994: entering _queue_task() for managed_node1/include_tasks 32935 1726853724.52215: worker is 1 (out of 1 available) 32935 1726853724.52235: exiting _queue_task() for managed_node1/include_tasks 32935 1726853724.52247: done queuing things up, now waiting for results queue to drain 32935 1726853724.52249: waiting for pending results... 32935 1726853724.52604: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32935 1726853724.52621: in run() - task 02083763-bbaf-84df-441d-000000000016 32935 1726853724.52631: variable 'ansible_search_path' from source: unknown 32935 1726853724.52634: variable 'ansible_search_path' from source: unknown 32935 1726853724.52665: calling self._execute() 32935 1726853724.52729: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.52732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.52741: variable 'omit' from source: magic vars 32935 1726853724.52996: variable 'ansible_distribution_major_version' from source: facts 32935 1726853724.53006: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853724.53012: _execute() done 32935 1726853724.53015: dumping result to json 32935 1726853724.53020: done dumping result, returning 32935 1726853724.53030: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-84df-441d-000000000016] 32935 1726853724.53066: sending task result for task 02083763-bbaf-84df-441d-000000000016 32935 1726853724.53206: done sending task result for task 02083763-bbaf-84df-441d-000000000016 32935 1726853724.53209: WORKER PROCESS EXITING 32935 1726853724.53255: no more pending results, returning what we have 32935 1726853724.53259: in VariableManager get_vars() 32935 1726853724.53297: Calling all_inventory to load vars for managed_node1 32935 1726853724.53299: Calling groups_inventory to load vars for managed_node1 32935 1726853724.53302: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.53310: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.53312: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.53315: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.53554: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.53755: done with get_vars() 32935 1726853724.53769: variable 'ansible_search_path' from source: unknown 32935 1726853724.53773: variable 'ansible_search_path' from source: unknown 32935 1726853724.53810: we have included files to process 32935 1726853724.53811: generating all_blocks data 32935 1726853724.53813: done generating all_blocks data 32935 1726853724.53816: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32935 1726853724.53817: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32935 1726853724.53819: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32935 1726853724.54560: done processing included file 32935 1726853724.54562: iterating over new_blocks loaded from include file 32935 1726853724.54564: in VariableManager get_vars() 32935 1726853724.54592: done with get_vars() 32935 1726853724.54597: filtering new block on tags 32935 1726853724.54614: done filtering new block on tags 32935 1726853724.54617: in VariableManager get_vars() 32935 1726853724.54640: done with get_vars() 32935 1726853724.54642: filtering new block on tags 32935 1726853724.54669: done filtering new block on tags 32935 1726853724.54675: in VariableManager get_vars() 32935 1726853724.54697: done with get_vars() 32935 1726853724.54699: filtering new block on tags 32935 1726853724.54721: done filtering new block on tags 32935 1726853724.54724: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 32935 1726853724.54729: extending task lists for all hosts with included blocks 32935 1726853724.55312: done extending task lists 32935 1726853724.55313: done processing included files 32935 1726853724.55314: results queue empty 32935 1726853724.55314: checking for any_errors_fatal 32935 1726853724.55316: done checking for any_errors_fatal 32935 1726853724.55317: checking for max_fail_percentage 32935 1726853724.55317: done checking for max_fail_percentage 32935 1726853724.55318: checking to see if all hosts have failed and the running result is not ok 32935 1726853724.55318: done checking to see if all hosts have failed 32935 1726853724.55319: getting the remaining hosts for this loop 32935 1726853724.55319: done getting the remaining hosts for this loop 32935 1726853724.55321: getting the next task for host managed_node1 32935 1726853724.55324: done getting next task for host managed_node1 32935 1726853724.55326: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32935 1726853724.55328: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853724.55334: getting variables 32935 1726853724.55335: in VariableManager get_vars() 32935 1726853724.55344: Calling all_inventory to load vars for managed_node1 32935 1726853724.55346: Calling groups_inventory to load vars for managed_node1 32935 1726853724.55347: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.55350: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.55352: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.55354: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.55464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.55582: done with get_vars() 32935 1726853724.55590: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:35:24 -0400 (0:00:00.036) 0:00:09.692 ****** 32935 1726853724.55637: entering _queue_task() for managed_node1/setup 32935 1726853724.55859: worker is 1 (out of 1 available) 32935 1726853724.55876: exiting _queue_task() for managed_node1/setup 32935 1726853724.55888: done queuing things up, now waiting for results queue to drain 32935 1726853724.55890: waiting for pending results... 32935 1726853724.56053: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32935 1726853724.56153: in run() - task 02083763-bbaf-84df-441d-0000000004bf 32935 1726853724.56166: variable 'ansible_search_path' from source: unknown 32935 1726853724.56169: variable 'ansible_search_path' from source: unknown 32935 1726853724.56199: calling self._execute() 32935 1726853724.56260: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.56266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.56276: variable 'omit' from source: magic vars 32935 1726853724.56536: variable 'ansible_distribution_major_version' from source: facts 32935 1726853724.56547: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853724.56698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853724.58332: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853724.58576: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853724.58580: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853724.58582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853724.58585: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853724.58587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
32935 1726853724.58600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853724.58625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853724.58667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853724.58687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853724.58740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853724.58779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853724.58808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853724.58849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853724.58872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853724.59029: variable '__network_required_facts' from source: role '' defaults 32935 1726853724.59045: variable 'ansible_facts' from source: unknown 32935 1726853724.59139: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 32935 1726853724.59148: when evaluation is False, skipping this task 32935 1726853724.59156: _execute() done 32935 1726853724.59167: dumping result to json 32935 1726853724.59177: done dumping result, returning 32935 1726853724.59190: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-84df-441d-0000000004bf] 32935 1726853724.59199: sending task result for task 02083763-bbaf-84df-441d-0000000004bf skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 1726853724.59342: no more pending results, returning what we have 32935 1726853724.59345: results queue empty 32935 1726853724.59346: checking for any_errors_fatal 32935 1726853724.59347: done checking for any_errors_fatal 32935 1726853724.59348: checking for max_fail_percentage 32935 1726853724.59349: done checking for max_fail_percentage 32935 1726853724.59350: checking to see if all hosts have failed and the running result is not ok 32935 1726853724.59351: done checking to see if all hosts have failed 32935 1726853724.59351: getting the remaining hosts for 
this loop 32935 1726853724.59353: done getting the remaining hosts for this loop 32935 1726853724.59356: getting the next task for host managed_node1 32935 1726853724.59374: done getting next task for host managed_node1 32935 1726853724.59378: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 32935 1726853724.59381: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853724.59393: getting variables 32935 1726853724.59395: in VariableManager get_vars() 32935 1726853724.59433: Calling all_inventory to load vars for managed_node1 32935 1726853724.59436: Calling groups_inventory to load vars for managed_node1 32935 1726853724.59438: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.59447: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.59450: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.59452: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.59816: done sending task result for task 02083763-bbaf-84df-441d-0000000004bf 32935 1726853724.59820: WORKER PROCESS EXITING 32935 1726853724.59865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.60181: done with get_vars() 32935 1726853724.60192: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:35:24 -0400 (0:00:00.046) 0:00:09.738 ****** 32935 1726853724.60308: entering _queue_task() for managed_node1/stat 32935 1726853724.60621: worker is 1 (out of 1 available) 32935 1726853724.60636: exiting _queue_task() for managed_node1/stat 32935 1726853724.60648: done queuing things up, now waiting for results queue to drain 32935 1726853724.60650: waiting for pending results... 
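The "Ensure ansible_facts used by role are present" setup task just above was skipped because the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False, i.e. every fact the role needs is already available from the earlier gather. Expressed in plain Python (the fact names and values below are placeholders, not the role's actual __network_required_facts list), the test is a simple list difference:

```python
# Plain-Python version of the skip condition evaluated above. The Jinja2
# `difference` filter keeps the elements of the first list that are missing
# from the second; fact names and values here are illustrative placeholders.
required_facts = ["distribution", "distribution_major_version", "os_family"]
gathered_facts = {"distribution": "CentOS",
                  "distribution_major_version": "10",
                  "os_family": "RedHat"}

missing = [f for f in required_facts if f not in gathered_facts]
run_setup = len(missing) > 0        # False here -> the setup task is skipped
print(missing, run_setup)
```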
32935 1726853724.60844: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 32935 1726853724.60998: in run() - task 02083763-bbaf-84df-441d-0000000004c1 32935 1726853724.61016: variable 'ansible_search_path' from source: unknown 32935 1726853724.61024: variable 'ansible_search_path' from source: unknown 32935 1726853724.61069: calling self._execute() 32935 1726853724.61153: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.61169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.61187: variable 'omit' from source: magic vars 32935 1726853724.61541: variable 'ansible_distribution_major_version' from source: facts 32935 1726853724.61562: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853724.61731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853724.62025: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853724.62076: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853724.62121: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853724.62160: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853724.62251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853724.62310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853724.62319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853724.62350: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853724.62442: variable '__network_is_ostree' from source: set_fact 32935 1726853724.62528: Evaluated conditional (not __network_is_ostree is defined): False 32935 1726853724.62531: when evaluation is False, skipping this task 32935 1726853724.62533: _execute() done 32935 1726853724.62535: dumping result to json 32935 1726853724.62537: done dumping result, returning 32935 1726853724.62540: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-84df-441d-0000000004c1] 32935 1726853724.62542: sending task result for task 02083763-bbaf-84df-441d-0000000004c1 32935 1726853724.62613: done sending task result for task 02083763-bbaf-84df-441d-0000000004c1 32935 1726853724.62616: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32935 1726853724.62685: no more pending results, returning what we have 32935 1726853724.62689: results queue empty 32935 1726853724.62689: checking for any_errors_fatal 32935 1726853724.62700: done checking for any_errors_fatal 32935 1726853724.62700: checking for 
max_fail_percentage 32935 1726853724.62702: done checking for max_fail_percentage 32935 1726853724.62703: checking to see if all hosts have failed and the running result is not ok 32935 1726853724.62704: done checking to see if all hosts have failed 32935 1726853724.62705: getting the remaining hosts for this loop 32935 1726853724.62706: done getting the remaining hosts for this loop 32935 1726853724.62709: getting the next task for host managed_node1 32935 1726853724.62716: done getting next task for host managed_node1 32935 1726853724.62719: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32935 1726853724.62723: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853724.62736: getting variables 32935 1726853724.62738: in VariableManager get_vars() 32935 1726853724.62779: Calling all_inventory to load vars for managed_node1 32935 1726853724.62782: Calling groups_inventory to load vars for managed_node1 32935 1726853724.62784: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.62794: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.62796: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.62798: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.63218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.63427: done with get_vars() 32935 1726853724.63439: done getting variables 32935 1726853724.63500: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:35:24 -0400 (0:00:00.032) 0:00:09.771 ****** 32935 1726853724.63544: entering _queue_task() for managed_node1/set_fact 32935 1726853724.63825: worker is 1 (out of 1 available) 32935 1726853724.63840: exiting _queue_task() for managed_node1/set_fact 32935 1726853724.63854: done queuing things up, now waiting for results queue to drain 32935 1726853724.63856: waiting for pending results... 
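Both ostree-related tasks ("Check if system is ostree" above and the "Set flag to indicate system is ostree" set_fact queued here) carry the guard not __network_is_ostree is defined, so once the flag has been set earlier in the run they are both skipped, as the evaluations in the surrounding chunks show. A small sketch of that compute-once pattern (the host-vars dict below is illustrative, not Ansible's internal storage):

```python
# Illustrative compute-once guard; the dict stands in for the host's variables.
host_vars = {"__network_is_ostree": False}     # already set earlier in the run

def needs_ostree_check(hostvars: dict) -> bool:
    # Equivalent of the task guard `when: not __network_is_ostree is defined`
    return "__network_is_ostree" not in hostvars

print(needs_ostree_check(host_vars))           # False -> both tasks are skipped
```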
32935 1726853724.64289: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32935 1726853724.64297: in run() - task 02083763-bbaf-84df-441d-0000000004c2 32935 1726853724.64319: variable 'ansible_search_path' from source: unknown 32935 1726853724.64327: variable 'ansible_search_path' from source: unknown 32935 1726853724.64368: calling self._execute() 32935 1726853724.64460: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.64475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.64496: variable 'omit' from source: magic vars 32935 1726853724.64869: variable 'ansible_distribution_major_version' from source: facts 32935 1726853724.64889: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853724.65255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853724.66077: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853724.66085: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853724.66130: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853724.66256: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853724.66426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853724.66461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853724.66522: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853724.66612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853724.66816: variable '__network_is_ostree' from source: set_fact 32935 1726853724.66864: Evaluated conditional (not __network_is_ostree is defined): False 32935 1726853724.66876: when evaluation is False, skipping this task 32935 1726853724.66894: _execute() done 32935 1726853724.66926: dumping result to json 32935 1726853724.66935: done dumping result, returning 32935 1726853724.67024: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-84df-441d-0000000004c2] 32935 1726853724.67027: sending task result for task 02083763-bbaf-84df-441d-0000000004c2 32935 1726853724.67117: done sending task result for task 02083763-bbaf-84df-441d-0000000004c2 32935 1726853724.67121: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32935 1726853724.67181: no more pending results, returning what we have 32935 1726853724.67184: results queue empty 32935 1726853724.67186: checking for any_errors_fatal 32935 1726853724.67192: done checking for any_errors_fatal 32935 
1726853724.67193: checking for max_fail_percentage 32935 1726853724.67195: done checking for max_fail_percentage 32935 1726853724.67196: checking to see if all hosts have failed and the running result is not ok 32935 1726853724.67198: done checking to see if all hosts have failed 32935 1726853724.67198: getting the remaining hosts for this loop 32935 1726853724.67200: done getting the remaining hosts for this loop 32935 1726853724.67204: getting the next task for host managed_node1 32935 1726853724.67216: done getting next task for host managed_node1 32935 1726853724.67221: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 32935 1726853724.67225: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853724.67241: getting variables 32935 1726853724.67243: in VariableManager get_vars() 32935 1726853724.67292: Calling all_inventory to load vars for managed_node1 32935 1726853724.67295: Calling groups_inventory to load vars for managed_node1 32935 1726853724.67298: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853724.67313: Calling all_plugins_play to load vars for managed_node1 32935 1726853724.67316: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853724.67320: Calling groups_plugins_play to load vars for managed_node1 32935 1726853724.67819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853724.68024: done with get_vars() 32935 1726853724.68035: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:35:24 -0400 (0:00:00.045) 0:00:09.816 ****** 32935 1726853724.68134: entering _queue_task() for managed_node1/service_facts 32935 1726853724.68136: Creating lock for service_facts 32935 1726853724.68423: worker is 1 (out of 1 available) 32935 1726853724.68438: exiting _queue_task() for managed_node1/service_facts 32935 1726853724.68450: done queuing things up, now waiting for results queue to drain 32935 1726853724.68451: waiting for pending results... 
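The role now queues its service_facts task ("Check which services are running"); the following chunk shows the usual remote bootstrap, starting with echo ~ to resolve the remote home directory before a per-task temp directory is created and the module is shipped. For orientation only, service_facts returns a services mapping keyed by unit name; the sample below is invented data in that documented shape (treat the exact field names as an assumption) showing how a consumer might pick out running services:

```python
# Invented sample shaped like ansible_facts.services (unit name -> properties);
# the field names follow the module's documented output but are an assumption here.
services = {
    "NetworkManager.service": {"state": "running", "status": "enabled",
                               "source": "systemd"},
    "firewalld.service": {"state": "stopped", "status": "disabled",
                          "source": "systemd"},
}

running = sorted(name for name, svc in services.items()
                 if svc.get("state") == "running")
print(running)                                  # ['NetworkManager.service']
```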
32935 1726853724.69087: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 32935 1726853724.69216: in run() - task 02083763-bbaf-84df-441d-0000000004c4 32935 1726853724.69779: variable 'ansible_search_path' from source: unknown 32935 1726853724.69783: variable 'ansible_search_path' from source: unknown 32935 1726853724.69785: calling self._execute() 32935 1726853724.69788: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.69790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.69793: variable 'omit' from source: magic vars 32935 1726853724.70744: variable 'ansible_distribution_major_version' from source: facts 32935 1726853724.70893: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853724.70905: variable 'omit' from source: magic vars 32935 1726853724.71092: variable 'omit' from source: magic vars 32935 1726853724.71263: variable 'omit' from source: magic vars 32935 1726853724.71314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853724.71440: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853724.71473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853724.71497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853724.71634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853724.71673: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853724.71685: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.71693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.71861: Set connection var ansible_timeout to 10 32935 1726853724.71906: Set connection var ansible_shell_type to sh 32935 1726853724.71918: Set connection var ansible_pipelining to False 32935 1726853724.71950: Set connection var ansible_connection to ssh 32935 1726853724.71964: Set connection var ansible_shell_executable to /bin/sh 32935 1726853724.72014: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853724.72043: variable 'ansible_shell_executable' from source: unknown 32935 1726853724.72061: variable 'ansible_connection' from source: unknown 32935 1726853724.72164: variable 'ansible_module_compression' from source: unknown 32935 1726853724.72167: variable 'ansible_shell_type' from source: unknown 32935 1726853724.72170: variable 'ansible_shell_executable' from source: unknown 32935 1726853724.72174: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853724.72176: variable 'ansible_pipelining' from source: unknown 32935 1726853724.72178: variable 'ansible_timeout' from source: unknown 32935 1726853724.72180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853724.72541: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853724.72605: variable 'omit' from source: magic vars 32935 
1726853724.72812: starting attempt loop 32935 1726853724.72816: running the handler 32935 1726853724.72818: _low_level_execute_command(): starting 32935 1726853724.72820: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853724.74103: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853724.74129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853724.74297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853724.74299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853724.74388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853724.74461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853724.76136: stdout chunk (state=3): >>>/root <<< 32935 1726853724.76313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853724.76346: stderr chunk (state=3): >>><<< 32935 1726853724.76355: stdout chunk (state=3): >>><<< 32935 1726853724.76441: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853724.76465: _low_level_execute_command(): starting 32935 1726853724.76524: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733 `" && echo 
ansible-tmp-1726853724.7644794-33478-230807135752733="` echo /root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733 `" ) && sleep 0' 32935 1726853724.77397: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853724.77409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853724.77496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853724.77620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853724.77625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853724.77643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853724.77713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853724.79718: stdout chunk (state=3): >>>ansible-tmp-1726853724.7644794-33478-230807135752733=/root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733 <<< 32935 1726853724.79769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853724.79865: stderr chunk (state=3): >>><<< 32935 1726853724.79869: stdout chunk (state=3): >>><<< 32935 1726853724.79933: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853724.7644794-33478-230807135752733=/root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853724.80004: variable 'ansible_module_compression' from source: unknown 32935 1726853724.80052: ANSIBALLZ: Using 
lock for service_facts 32935 1726853724.80056: ANSIBALLZ: Acquiring lock 32935 1726853724.80061: ANSIBALLZ: Lock acquired: 140683291442304 32935 1726853724.80064: ANSIBALLZ: Creating module 32935 1726853725.00629: ANSIBALLZ: Writing module into payload 32935 1726853725.00830: ANSIBALLZ: Writing module 32935 1726853725.00834: ANSIBALLZ: Renaming module 32935 1726853725.00897: ANSIBALLZ: Done creating module 32935 1726853725.01105: variable 'ansible_facts' from source: unknown 32935 1726853725.01150: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/AnsiballZ_service_facts.py 32935 1726853725.01444: Sending initial data 32935 1726853725.01452: Sent initial data (162 bytes) 32935 1726853725.02721: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853725.02794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853725.02826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853725.02948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853725.04583: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32935 1726853725.04590: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853725.04619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853725.04661: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpx_8zq2hl /root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/AnsiballZ_service_facts.py <<< 32935 1726853725.04664: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/AnsiballZ_service_facts.py" <<< 32935 1726853725.04694: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpx_8zq2hl" to remote "/root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/AnsiballZ_service_facts.py" <<< 32935 1726853725.04697: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/AnsiballZ_service_facts.py" <<< 32935 1726853725.05229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853725.05272: stderr chunk (state=3): >>><<< 32935 1726853725.05276: stdout chunk (state=3): >>><<< 32935 1726853725.05311: done transferring module to remote 32935 1726853725.05320: _low_level_execute_command(): starting 32935 1726853725.05326: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/ /root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/AnsiballZ_service_facts.py && sleep 0' 32935 1726853725.05950: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853725.05988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853725.06023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853725.08083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853725.08086: stdout chunk (state=3): >>><<< 32935 1726853725.08089: stderr chunk (state=3): >>><<< 32935 1726853725.08092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853725.08094: _low_level_execute_command(): starting 32935 1726853725.08096: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/AnsiballZ_service_facts.py && sleep 0' 32935 1726853725.08628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853725.08642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853725.08654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853725.08665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853725.08716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853725.08729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853725.08788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853726.63608: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": 
"cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 32935 1726853726.63632: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": 
"systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 32935 1726853726.63670: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": <<< 32935 1726853726.63677: stdout chunk (state=3): >>>"systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 32935 1726853726.65205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853726.65237: stderr chunk (state=3): >>><<< 32935 1726853726.65241: stdout chunk (state=3): >>><<< 32935 1726853726.65270: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": 
{"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853726.66487: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853726.66494: _low_level_execute_command(): starting 32935 1726853726.66499: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853724.7644794-33478-230807135752733/ > /dev/null 2>&1 && sleep 0' 32935 1726853726.66939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853726.66943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853726.66974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853726.66981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853726.66983: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853726.66987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853726.66998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853726.67157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853726.67216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853726.67256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853726.69115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853726.69123: stderr chunk (state=3): >>><<< 32935 1726853726.69126: stdout chunk (state=3): >>><<< 32935 1726853726.69379: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853726.69382: handler run complete 32935 1726853726.69384: variable 'ansible_facts' from source: unknown 32935 1726853726.69526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853726.70047: variable 'ansible_facts' from source: unknown 32935 1726853726.70193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853726.70426: attempt loop complete, returning result 32935 1726853726.70436: _execute() done 32935 1726853726.70443: dumping result to json 32935 1726853726.70520: done dumping result, returning 32935 1726853726.70534: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-84df-441d-0000000004c4] 32935 1726853726.70543: sending task result for task 02083763-bbaf-84df-441d-0000000004c4 32935 1726853726.71936: done sending task result for task 02083763-bbaf-84df-441d-0000000004c4 32935 1726853726.71940: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 1726853726.72018: no more pending results, returning what we have 32935 1726853726.72020: results queue empty 32935 1726853726.72021: checking for any_errors_fatal 32935 1726853726.72024: done checking for any_errors_fatal 32935 1726853726.72025: checking for max_fail_percentage 32935 1726853726.72026: done checking for max_fail_percentage 32935 1726853726.72027: checking to see if all hosts have failed and the running result is not ok 32935 1726853726.72028: done checking to see if all hosts have failed 32935 1726853726.72029: getting the remaining hosts for this loop 32935 1726853726.72030: done getting the remaining hosts for this loop 32935 1726853726.72033: getting the next task for host managed_node1 32935 1726853726.72039: done getting next task for host managed_node1 32935 1726853726.72041: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 32935 1726853726.72045: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853726.72054: getting variables 32935 1726853726.72055: in VariableManager get_vars() 32935 1726853726.72086: Calling all_inventory to load vars for managed_node1 32935 1726853726.72089: Calling groups_inventory to load vars for managed_node1 32935 1726853726.72091: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853726.72106: Calling all_plugins_play to load vars for managed_node1 32935 1726853726.72109: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853726.72112: Calling groups_plugins_play to load vars for managed_node1 32935 1726853726.72482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853726.72775: done with get_vars() 32935 1726853726.72784: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:35:26 -0400 (0:00:02.047) 0:00:11.864 ****** 32935 1726853726.72854: entering _queue_task() for managed_node1/package_facts 32935 1726853726.72855: Creating lock for package_facts 32935 1726853726.73097: worker is 1 (out of 1 available) 32935 1726853726.73111: exiting _queue_task() for managed_node1/package_facts 32935 1726853726.73124: done queuing things up, now waiting for results queue to drain 32935 1726853726.73126: waiting for pending results... 
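[editor's note] The package_facts task queued above will follow the same _execute_module path that service_facts just completed: the module output lands under "ansible_facts" ("services" keyed by unit name, "packages" keyed by package name). As a rough offline illustration of those payload shapes only — not part of the playbook or of Ansible itself — the Python sketch below post-processes a captured payload; the sample entries are abbreviated from the output recorded in this log, and the helper function names are hypothetical.

    # Hypothetical offline helper for inspecting fact payloads captured in this log.
    # The structures mirror what service_facts and package_facts report under
    # "ansible_facts": services keyed by unit name, packages keyed by package name.
    import json

    SAMPLE = json.loads("""
    {
      "services": {
        "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"},
        "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}
      },
      "packages": {
        "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]
      }
    }
    """)

    def running_services(services):
        """Return unit names whose reported state is 'running'."""
        return sorted(unit for unit, info in services.items() if info.get("state") == "running")

    def package_versions(packages):
        """Flatten package facts into 'name-version-release.arch' strings."""
        return sorted(
            f"{p['name']}-{p['version']}-{p['release']}.{p['arch']}"
            for entries in packages.values()
            for p in entries
        )

    if __name__ == "__main__":
        print("running:", running_services(SAMPLE["services"]))
        print("installed:", package_versions(SAMPLE["packages"]))

Within the play itself these values are normally consumed directly as ansible_facts.services and ansible_facts.packages (as the role's conditionals do later in this run); the sketch above is only for examining a payload outside Ansible. [end editor's note]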
32935 1726853726.73301: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 32935 1726853726.73400: in run() - task 02083763-bbaf-84df-441d-0000000004c5 32935 1726853726.73410: variable 'ansible_search_path' from source: unknown 32935 1726853726.73414: variable 'ansible_search_path' from source: unknown 32935 1726853726.73443: calling self._execute() 32935 1726853726.73506: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853726.73509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853726.73519: variable 'omit' from source: magic vars 32935 1726853726.73803: variable 'ansible_distribution_major_version' from source: facts 32935 1726853726.73813: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853726.73819: variable 'omit' from source: magic vars 32935 1726853726.73864: variable 'omit' from source: magic vars 32935 1726853726.73888: variable 'omit' from source: magic vars 32935 1726853726.73923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853726.73949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853726.73967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853726.73982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853726.73991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853726.74018: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853726.74022: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853726.74024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853726.74092: Set connection var ansible_timeout to 10 32935 1726853726.74096: Set connection var ansible_shell_type to sh 32935 1726853726.74103: Set connection var ansible_pipelining to False 32935 1726853726.74105: Set connection var ansible_connection to ssh 32935 1726853726.74114: Set connection var ansible_shell_executable to /bin/sh 32935 1726853726.74117: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853726.74136: variable 'ansible_shell_executable' from source: unknown 32935 1726853726.74139: variable 'ansible_connection' from source: unknown 32935 1726853726.74142: variable 'ansible_module_compression' from source: unknown 32935 1726853726.74145: variable 'ansible_shell_type' from source: unknown 32935 1726853726.74147: variable 'ansible_shell_executable' from source: unknown 32935 1726853726.74149: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853726.74151: variable 'ansible_pipelining' from source: unknown 32935 1726853726.74153: variable 'ansible_timeout' from source: unknown 32935 1726853726.74160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853726.74302: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853726.74309: variable 'omit' from source: magic vars 32935 
1726853726.74314: starting attempt loop 32935 1726853726.74317: running the handler 32935 1726853726.74330: _low_level_execute_command(): starting 32935 1726853726.74342: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853726.75164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853726.75168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853726.75211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853726.76858: stdout chunk (state=3): >>>/root <<< 32935 1726853726.77019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853726.77022: stdout chunk (state=3): >>><<< 32935 1726853726.77024: stderr chunk (state=3): >>><<< 32935 1726853726.77042: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853726.77061: _low_level_execute_command(): starting 32935 1726853726.77140: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885 `" && echo ansible-tmp-1726853726.7704837-33545-34197919222885="` echo /root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885 `" ) && sleep 0' 32935 1726853726.77779: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853726.77817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853726.77832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853726.77907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853726.79778: stdout chunk (state=3): >>>ansible-tmp-1726853726.7704837-33545-34197919222885=/root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885 <<< 32935 1726853726.79929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853726.79933: stdout chunk (state=3): >>><<< 32935 1726853726.79935: stderr chunk (state=3): >>><<< 32935 1726853726.80077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853726.7704837-33545-34197919222885=/root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853726.80081: variable 'ansible_module_compression' from source: unknown 32935 1726853726.80083: ANSIBALLZ: Using lock for package_facts 32935 1726853726.80085: ANSIBALLZ: Acquiring lock 32935 1726853726.80087: ANSIBALLZ: Lock acquired: 140683289627536 32935 1726853726.80089: ANSIBALLZ: Creating module 32935 1726853727.20888: ANSIBALLZ: Writing module into payload 32935 1726853727.21036: ANSIBALLZ: Writing module 32935 1726853727.21153: ANSIBALLZ: Renaming module 32935 
1726853727.21156: ANSIBALLZ: Done creating module 32935 1726853727.21161: variable 'ansible_facts' from source: unknown 32935 1726853727.21313: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/AnsiballZ_package_facts.py 32935 1726853727.21496: Sending initial data 32935 1726853727.21505: Sent initial data (161 bytes) 32935 1726853727.22651: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853727.22668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853727.22725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853727.22790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853727.22808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853727.22838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853727.23048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853727.24595: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 32935 1726853727.24610: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 32935 1726853727.24633: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853727.24693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853727.24767: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpal1r3egj /root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/AnsiballZ_package_facts.py <<< 32935 1726853727.24798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/AnsiballZ_package_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpal1r3egj" to remote "/root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/AnsiballZ_package_facts.py" <<< 32935 1726853727.27579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853727.27585: stderr chunk (state=3): >>><<< 32935 1726853727.27588: stdout chunk (state=3): >>><<< 32935 1726853727.27590: done transferring module to remote 32935 1726853727.27592: _low_level_execute_command(): starting 32935 1726853727.27594: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/ /root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/AnsiballZ_package_facts.py && sleep 0' 32935 1726853727.28386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853727.28437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853727.28465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853727.28506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853727.28709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853727.30392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853727.30395: stdout chunk (state=3): >>><<< 32935 1726853727.30400: stderr chunk (state=3): >>><<< 32935 1726853727.30420: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853727.30423: _low_level_execute_command(): starting 32935 1726853727.30428: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/AnsiballZ_package_facts.py && sleep 0' 32935 1726853727.31059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853727.31074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853727.31086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853727.31103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853727.31113: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853727.31120: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853727.31139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853727.31229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853727.31283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853727.31330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853727.75413: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": 
"publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": 
"14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": 
"0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": 
"0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": 
"krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 32935 1726853727.75441: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": 
"2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", 
"version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 32935 1726853727.75463: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 32935 1726853727.77213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853727.77242: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853727.77246: stdout chunk (state=3): >>><<< 32935 1726853727.77248: stderr chunk (state=3): >>><<< 32935 1726853727.77282: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853727.80277: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853727.80569: _low_level_execute_command(): starting 32935 1726853727.80577: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853726.7704837-33545-34197919222885/ > /dev/null 2>&1 && sleep 0' 32935 1726853727.81304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853727.81344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853727.81361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853727.81385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853727.81458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853727.83394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853727.83728: stderr chunk (state=3): >>><<< 32935 1726853727.83732: stdout chunk 
(state=3): >>><<< 32935 1726853727.83812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853727.83816: handler run complete 32935 1726853727.85456: variable 'ansible_facts' from source: unknown 32935 1726853727.85953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853727.87010: variable 'ansible_facts' from source: unknown 32935 1726853727.87476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853727.88240: attempt loop complete, returning result 32935 1726853727.88251: _execute() done 32935 1726853727.88253: dumping result to json 32935 1726853727.88395: done dumping result, returning 32935 1726853727.88408: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-84df-441d-0000000004c5] 32935 1726853727.88412: sending task result for task 02083763-bbaf-84df-441d-0000000004c5 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 1726853727.94054: done sending task result for task 02083763-bbaf-84df-441d-0000000004c5 32935 1726853727.94057: WORKER PROCESS EXITING 32935 1726853727.94080: no more pending results, returning what we have 32935 1726853727.94084: results queue empty 32935 1726853727.94085: checking for any_errors_fatal 32935 1726853727.94090: done checking for any_errors_fatal 32935 1726853727.94091: checking for max_fail_percentage 32935 1726853727.94093: done checking for max_fail_percentage 32935 1726853727.94094: checking to see if all hosts have failed and the running result is not ok 32935 1726853727.94095: done checking to see if all hosts have failed 32935 1726853727.94096: getting the remaining hosts for this loop 32935 1726853727.94097: done getting the remaining hosts for this loop 32935 1726853727.94101: getting the next task for host managed_node1 32935 1726853727.94109: done getting next task for host managed_node1 32935 1726853727.94113: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 32935 1726853727.94116: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853727.94128: getting variables 32935 1726853727.94130: in VariableManager get_vars() 32935 1726853727.94160: Calling all_inventory to load vars for managed_node1 32935 1726853727.94163: Calling groups_inventory to load vars for managed_node1 32935 1726853727.94165: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853727.94180: Calling all_plugins_play to load vars for managed_node1 32935 1726853727.94183: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853727.94186: Calling groups_plugins_play to load vars for managed_node1 32935 1726853727.95319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853727.96707: done with get_vars() 32935 1726853727.96726: done getting variables 32935 1726853727.96775: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:27 -0400 (0:00:01.239) 0:00:13.103 ****** 32935 1726853727.96799: entering _queue_task() for managed_node1/debug 32935 1726853727.97033: worker is 1 (out of 1 available) 32935 1726853727.97048: exiting _queue_task() for managed_node1/debug 32935 1726853727.97060: done queuing things up, now waiting for results queue to drain 32935 1726853727.97062: waiting for pending results... 
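Editor's note: the module invocation recorded above is the role's "Check which packages are installed" step. package_facts ran with manager=["auto"] and strategy="first" (see module_args in the result) and the returned data was censored because no_log was in effect. A minimal sketch of such a task, built only from what the log shows plus standard module options, might be:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto        # module_args in the log show manager=["auto"]
        strategy: first      # and strategy="first": stop at the first working manager
      no_log: true           # matches the "output has been hidden" result above

The gathered data lands in ansible_facts.packages, which is why later role conditionals can inspect installed package versions without re-querying rpm.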
32935 1726853727.97234: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 32935 1726853727.97321: in run() - task 02083763-bbaf-84df-441d-000000000017 32935 1726853727.97332: variable 'ansible_search_path' from source: unknown 32935 1726853727.97335: variable 'ansible_search_path' from source: unknown 32935 1726853727.97366: calling self._execute() 32935 1726853727.97433: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853727.97438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853727.97448: variable 'omit' from source: magic vars 32935 1726853727.97733: variable 'ansible_distribution_major_version' from source: facts 32935 1726853727.97742: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853727.97747: variable 'omit' from source: magic vars 32935 1726853727.97786: variable 'omit' from source: magic vars 32935 1726853727.97855: variable 'network_provider' from source: set_fact 32935 1726853727.97873: variable 'omit' from source: magic vars 32935 1726853727.97906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853727.97932: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853727.97949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853727.97967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853727.97979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853727.98002: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853727.98005: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853727.98007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853727.98079: Set connection var ansible_timeout to 10 32935 1726853727.98084: Set connection var ansible_shell_type to sh 32935 1726853727.98091: Set connection var ansible_pipelining to False 32935 1726853727.98093: Set connection var ansible_connection to ssh 32935 1726853727.98098: Set connection var ansible_shell_executable to /bin/sh 32935 1726853727.98103: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853727.98121: variable 'ansible_shell_executable' from source: unknown 32935 1726853727.98125: variable 'ansible_connection' from source: unknown 32935 1726853727.98127: variable 'ansible_module_compression' from source: unknown 32935 1726853727.98129: variable 'ansible_shell_type' from source: unknown 32935 1726853727.98132: variable 'ansible_shell_executable' from source: unknown 32935 1726853727.98134: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853727.98136: variable 'ansible_pipelining' from source: unknown 32935 1726853727.98138: variable 'ansible_timeout' from source: unknown 32935 1726853727.98143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853727.98249: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 32935 1726853727.98258: variable 'omit' from source: magic vars 32935 1726853727.98265: starting attempt loop 32935 1726853727.98270: running the handler 32935 1726853727.98307: handler run complete 32935 1726853727.98318: attempt loop complete, returning result 32935 1726853727.98321: _execute() done 32935 1726853727.98323: dumping result to json 32935 1726853727.98326: done dumping result, returning 32935 1726853727.98333: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-84df-441d-000000000017] 32935 1726853727.98337: sending task result for task 02083763-bbaf-84df-441d-000000000017 32935 1726853727.98423: done sending task result for task 02083763-bbaf-84df-441d-000000000017 32935 1726853727.98426: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 32935 1726853727.98483: no more pending results, returning what we have 32935 1726853727.98486: results queue empty 32935 1726853727.98487: checking for any_errors_fatal 32935 1726853727.98495: done checking for any_errors_fatal 32935 1726853727.98495: checking for max_fail_percentage 32935 1726853727.98497: done checking for max_fail_percentage 32935 1726853727.98498: checking to see if all hosts have failed and the running result is not ok 32935 1726853727.98499: done checking to see if all hosts have failed 32935 1726853727.98500: getting the remaining hosts for this loop 32935 1726853727.98501: done getting the remaining hosts for this loop 32935 1726853727.98504: getting the next task for host managed_node1 32935 1726853727.98512: done getting next task for host managed_node1 32935 1726853727.98515: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32935 1726853727.98517: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853727.98526: getting variables 32935 1726853727.98528: in VariableManager get_vars() 32935 1726853727.98565: Calling all_inventory to load vars for managed_node1 32935 1726853727.98568: Calling groups_inventory to load vars for managed_node1 32935 1726853727.98572: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853727.98585: Calling all_plugins_play to load vars for managed_node1 32935 1726853727.98588: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853727.98591: Calling groups_plugins_play to load vars for managed_node1 32935 1726853727.99880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.00751: done with get_vars() 32935 1726853728.00768: done getting variables 32935 1726853728.00810: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:28 -0400 (0:00:00.040) 0:00:13.143 ****** 32935 1726853728.00832: entering _queue_task() for managed_node1/fail 32935 1726853728.01052: worker is 1 (out of 1 available) 32935 1726853728.01066: exiting _queue_task() for managed_node1/fail 32935 1726853728.01080: done queuing things up, now waiting for results queue to drain 32935 1726853728.01082: waiting for pending results... 
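Editor's note: the "Print network provider" task that just completed is a plain debug action reporting the network_provider value resolved earlier ("Using network provider: nm"). A rough equivalent, with the message format assumed rather than copied from the role source, would be:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"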
32935 1726853728.01250: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32935 1726853728.01335: in run() - task 02083763-bbaf-84df-441d-000000000018 32935 1726853728.01346: variable 'ansible_search_path' from source: unknown 32935 1726853728.01351: variable 'ansible_search_path' from source: unknown 32935 1726853728.01382: calling self._execute() 32935 1726853728.01446: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.01450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.01458: variable 'omit' from source: magic vars 32935 1726853728.01733: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.01743: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.01829: variable 'network_state' from source: role '' defaults 32935 1726853728.01838: Evaluated conditional (network_state != {}): False 32935 1726853728.01841: when evaluation is False, skipping this task 32935 1726853728.01844: _execute() done 32935 1726853728.01849: dumping result to json 32935 1726853728.01851: done dumping result, returning 32935 1726853728.01865: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-84df-441d-000000000018] 32935 1726853728.01869: sending task result for task 02083763-bbaf-84df-441d-000000000018 32935 1726853728.01946: done sending task result for task 02083763-bbaf-84df-441d-000000000018 32935 1726853728.01949: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853728.02017: no more pending results, returning what we have 32935 1726853728.02020: results queue empty 32935 1726853728.02022: checking for any_errors_fatal 32935 1726853728.02029: done checking for any_errors_fatal 32935 1726853728.02030: checking for max_fail_percentage 32935 1726853728.02032: done checking for max_fail_percentage 32935 1726853728.02032: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.02033: done checking to see if all hosts have failed 32935 1726853728.02034: getting the remaining hosts for this loop 32935 1726853728.02036: done getting the remaining hosts for this loop 32935 1726853728.02039: getting the next task for host managed_node1 32935 1726853728.02046: done getting next task for host managed_node1 32935 1726853728.02049: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32935 1726853728.02051: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853728.02064: getting variables 32935 1726853728.02066: in VariableManager get_vars() 32935 1726853728.02102: Calling all_inventory to load vars for managed_node1 32935 1726853728.02105: Calling groups_inventory to load vars for managed_node1 32935 1726853728.02107: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.02115: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.02117: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.02120: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.02864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.03818: done with get_vars() 32935 1726853728.03833: done getting variables 32935 1726853728.03879: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:28 -0400 (0:00:00.030) 0:00:13.174 ****** 32935 1726853728.03904: entering _queue_task() for managed_node1/fail 32935 1726853728.04128: worker is 1 (out of 1 available) 32935 1726853728.04141: exiting _queue_task() for managed_node1/fail 32935 1726853728.04155: done queuing things up, now waiting for results queue to drain 32935 1726853728.04156: waiting for pending results... 
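Editor's note: the task skipped just above guards against combining the network_state variable with the initscripts provider; it was skipped because network_state is still the role default {} and the condition network_state != {} evaluated to False. A hedged sketch of that guard pattern follows; only the first condition is taken from the log, the message text and the provider check are assumptions:

    - name: Abort when network_state is used with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported with the initscripts provider  # assumed wording
      when:
        - network_state != {}                  # the condition the log reports as false
        - network_provider == "initscripts"    # assumed second condition implied by the task name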
32935 1726853728.04329: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32935 1726853728.04415: in run() - task 02083763-bbaf-84df-441d-000000000019 32935 1726853728.04427: variable 'ansible_search_path' from source: unknown 32935 1726853728.04430: variable 'ansible_search_path' from source: unknown 32935 1726853728.04460: calling self._execute() 32935 1726853728.04525: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.04529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.04538: variable 'omit' from source: magic vars 32935 1726853728.04807: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.04818: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.04900: variable 'network_state' from source: role '' defaults 32935 1726853728.04909: Evaluated conditional (network_state != {}): False 32935 1726853728.04912: when evaluation is False, skipping this task 32935 1726853728.04915: _execute() done 32935 1726853728.04918: dumping result to json 32935 1726853728.04920: done dumping result, returning 32935 1726853728.04929: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-84df-441d-000000000019] 32935 1726853728.04932: sending task result for task 02083763-bbaf-84df-441d-000000000019 32935 1726853728.05017: done sending task result for task 02083763-bbaf-84df-441d-000000000019 32935 1726853728.05021: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853728.05091: no more pending results, returning what we have 32935 1726853728.05094: results queue empty 32935 1726853728.05095: checking for any_errors_fatal 32935 1726853728.05102: done checking for any_errors_fatal 32935 1726853728.05103: checking for max_fail_percentage 32935 1726853728.05105: done checking for max_fail_percentage 32935 1726853728.05106: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.05107: done checking to see if all hosts have failed 32935 1726853728.05107: getting the remaining hosts for this loop 32935 1726853728.05109: done getting the remaining hosts for this loop 32935 1726853728.05112: getting the next task for host managed_node1 32935 1726853728.05119: done getting next task for host managed_node1 32935 1726853728.05122: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32935 1726853728.05124: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853728.05136: getting variables 32935 1726853728.05138: in VariableManager get_vars() 32935 1726853728.05173: Calling all_inventory to load vars for managed_node1 32935 1726853728.05175: Calling groups_inventory to load vars for managed_node1 32935 1726853728.05177: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.05186: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.05188: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.05191: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.05922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.06786: done with get_vars() 32935 1726853728.06802: done getting variables 32935 1726853728.06847: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:35:28 -0400 (0:00:00.029) 0:00:13.204 ****** 32935 1726853728.06873: entering _queue_task() for managed_node1/fail 32935 1726853728.07091: worker is 1 (out of 1 available) 32935 1726853728.07105: exiting _queue_task() for managed_node1/fail 32935 1726853728.07119: done queuing things up, now waiting for results queue to drain 32935 1726853728.07120: waiting for pending results... 
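Editor's note: each of these role tasks is also gated on distribution facts. The log shows ansible_distribution_major_version != '6' evaluated before every task, and the EL10 teaming check queued next additionally evaluates ansible_distribution_major_version | int > 9 and ansible_distribution in __network_rh_distros. Purely as an illustration of that gating style (task name and message invented), such a check can be written as:

    - name: Illustrative version gate in the style of the EL10 teaming check
      ansible.builtin.fail:
        msg: Teaming is not supported on this platform     # assumed message
      when:
        - ansible_distribution_major_version | int > 9     # condition shown in the log
        - ansible_distribution in __network_rh_distros     # role default referenced in the log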
32935 1726853728.07297: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32935 1726853728.07393: in run() - task 02083763-bbaf-84df-441d-00000000001a 32935 1726853728.07404: variable 'ansible_search_path' from source: unknown 32935 1726853728.07408: variable 'ansible_search_path' from source: unknown 32935 1726853728.07435: calling self._execute() 32935 1726853728.07505: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.07509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.07518: variable 'omit' from source: magic vars 32935 1726853728.07793: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.07800: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.07922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853728.09432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853728.09486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853728.09513: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853728.09541: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853728.09561: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853728.09625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.09649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.09669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.09696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.09706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.09781: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.09793: Evaluated conditional (ansible_distribution_major_version | int > 9): True 32935 1726853728.09875: variable 'ansible_distribution' from source: facts 32935 1726853728.09879: variable '__network_rh_distros' from source: role '' defaults 32935 1726853728.09887: Evaluated conditional (ansible_distribution in __network_rh_distros): True 32935 1726853728.10038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.10054: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.10078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.10103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.10114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.10145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.10162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.10184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.10209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.10219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.10248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.10266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.10288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.10313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.10323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.10518: variable 'network_connections' from source: task vars 32935 1726853728.10528: variable 'interface' from source: play vars 32935 1726853728.10582: variable 'interface' from source: play vars 32935 1726853728.10592: variable 'vlan_interface' from source: play vars 32935 1726853728.10638: variable 'vlan_interface' from source: play vars 32935 1726853728.10644: variable 'interface' from source: play vars 32935 
1726853728.10689: variable 'interface' from source: play vars 32935 1726853728.10699: variable 'network_state' from source: role '' defaults 32935 1726853728.10747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853728.11120: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853728.11146: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853728.11175: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853728.11198: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853728.11230: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853728.11245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853728.11267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.11289: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853728.11317: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 32935 1726853728.11320: when evaluation is False, skipping this task 32935 1726853728.11323: _execute() done 32935 1726853728.11325: dumping result to json 32935 1726853728.11327: done dumping result, returning 32935 1726853728.11334: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-84df-441d-00000000001a] 32935 1726853728.11337: sending task result for task 02083763-bbaf-84df-441d-00000000001a 32935 1726853728.11424: done sending task result for task 02083763-bbaf-84df-441d-00000000001a 32935 1726853728.11427: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 32935 1726853728.11482: no more pending results, returning what we have 32935 1726853728.11485: results queue empty 32935 1726853728.11486: checking for any_errors_fatal 32935 1726853728.11492: done checking for any_errors_fatal 32935 1726853728.11493: checking for max_fail_percentage 32935 1726853728.11494: done checking for max_fail_percentage 32935 1726853728.11495: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.11496: done checking to see if all hosts have failed 32935 1726853728.11497: getting 
the remaining hosts for this loop 32935 1726853728.11499: done getting the remaining hosts for this loop 32935 1726853728.11502: getting the next task for host managed_node1 32935 1726853728.11509: done getting next task for host managed_node1 32935 1726853728.11513: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32935 1726853728.11515: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853728.11528: getting variables 32935 1726853728.11529: in VariableManager get_vars() 32935 1726853728.11573: Calling all_inventory to load vars for managed_node1 32935 1726853728.11575: Calling groups_inventory to load vars for managed_node1 32935 1726853728.11578: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.11587: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.11590: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.11592: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.12507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.13389: done with get_vars() 32935 1726853728.13406: done getting variables 32935 1726853728.13483: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:35:28 -0400 (0:00:00.066) 0:00:13.270 ****** 32935 1726853728.13506: entering _queue_task() for managed_node1/dnf 32935 1726853728.13750: worker is 1 (out of 1 available) 32935 1726853728.13767: exiting _queue_task() for managed_node1/dnf 32935 1726853728.13782: done queuing things up, now waiting for results queue to drain 32935 1726853728.13784: waiting for pending results... 
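Editor's note: the false_condition printed for the teaming check above is a selectattr chain that looks for any connection or interface whose type matches ^team$. The expression can be exercised on its own; the sketch below copies the filter from the log and runs it against invented sample data:

    - name: Evaluate the team-detection filter against hypothetical data
      vars:
        network_connections:        # made-up sample list, not from this run
          - name: eth0
            type: ethernet
          - name: team0
            type: team
      ansible.builtin.debug:
        msg: >-
          {{ network_connections
             | selectattr("type", "defined")
             | selectattr("type", "match", "^team$")
             | list | length > 0 }}

With the sample list the expression renders True; in this run it evaluated to False because no team-type connections were defined.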
32935 1726853728.13948: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32935 1726853728.14041: in run() - task 02083763-bbaf-84df-441d-00000000001b 32935 1726853728.14052: variable 'ansible_search_path' from source: unknown 32935 1726853728.14055: variable 'ansible_search_path' from source: unknown 32935 1726853728.14087: calling self._execute() 32935 1726853728.14155: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.14162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.14170: variable 'omit' from source: magic vars 32935 1726853728.14434: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.14445: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.14582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853728.16520: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853728.16573: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853728.16600: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853728.16629: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853728.16649: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853728.16709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.16733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.16750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.16779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.16790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.16875: variable 'ansible_distribution' from source: facts 32935 1726853728.16879: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.16892: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 32935 1726853728.16969: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853728.17052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.17072: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.17088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.17115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.17126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.17155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.17174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.17190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.17214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.17224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.17251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.17275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.17290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.17314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.17324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.17425: variable 'network_connections' from source: task vars 32935 1726853728.17436: variable 'interface' from source: play vars 32935 1726853728.17485: variable 'interface' from source: play vars 32935 1726853728.17501: variable 'vlan_interface' from source: play vars 32935 1726853728.17539: variable 'vlan_interface' from source: play vars 32935 1726853728.17544: variable 'interface' from source: play vars 32935 
1726853728.17589: variable 'interface' from source: play vars 32935 1726853728.17638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853728.17751: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853728.17781: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853728.17814: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853728.17839: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853728.17869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853728.17889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853728.17907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.17928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853728.17973: variable '__network_team_connections_defined' from source: role '' defaults 32935 1726853728.18118: variable 'network_connections' from source: task vars 32935 1726853728.18122: variable 'interface' from source: play vars 32935 1726853728.18167: variable 'interface' from source: play vars 32935 1726853728.18176: variable 'vlan_interface' from source: play vars 32935 1726853728.18216: variable 'vlan_interface' from source: play vars 32935 1726853728.18221: variable 'interface' from source: play vars 32935 1726853728.18267: variable 'interface' from source: play vars 32935 1726853728.18294: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32935 1726853728.18297: when evaluation is False, skipping this task 32935 1726853728.18300: _execute() done 32935 1726853728.18302: dumping result to json 32935 1726853728.18304: done dumping result, returning 32935 1726853728.18312: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-84df-441d-00000000001b] 32935 1726853728.18316: sending task result for task 02083763-bbaf-84df-441d-00000000001b 32935 1726853728.18404: done sending task result for task 02083763-bbaf-84df-441d-00000000001b 32935 1726853728.18406: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32935 1726853728.18453: no more pending results, returning what we have 32935 1726853728.18456: results queue empty 32935 1726853728.18457: checking for any_errors_fatal 32935 1726853728.18465: done checking for any_errors_fatal 32935 1726853728.18466: checking for max_fail_percentage 32935 1726853728.18467: done checking for max_fail_percentage 32935 
1726853728.18468: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.18470: done checking to see if all hosts have failed 32935 1726853728.18473: getting the remaining hosts for this loop 32935 1726853728.18474: done getting the remaining hosts for this loop 32935 1726853728.18478: getting the next task for host managed_node1 32935 1726853728.18485: done getting next task for host managed_node1 32935 1726853728.18489: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32935 1726853728.18515: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853728.18598: getting variables 32935 1726853728.18600: in VariableManager get_vars() 32935 1726853728.18689: Calling all_inventory to load vars for managed_node1 32935 1726853728.18693: Calling groups_inventory to load vars for managed_node1 32935 1726853728.18695: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.18704: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.18707: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.18710: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.20195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.21898: done with get_vars() 32935 1726853728.21931: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32935 1726853728.22021: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:35:28 -0400 (0:00:00.085) 0:00:13.356 ****** 32935 1726853728.22053: entering _queue_task() for managed_node1/yum 32935 1726853728.22055: Creating lock for yum 32935 1726853728.22685: worker is 1 (out of 1 available) 32935 1726853728.22693: exiting _queue_task() for managed_node1/yum 32935 1726853728.22703: done queuing things up, now waiting for results queue to drain 32935 1726853728.22704: waiting for pending results... 
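Reading the trace above: the "false_condition" field in each skipping: result echoes the task's when: clause verbatim, and the TASK header that follows gives the file and line inside the role (here roles/network/tasks/main.yml:48) plus the per-task and cumulative timings. A minimal, hypothetical task showing the same gating pattern; this is not the actual role source, only the two variable names are taken from the log, and the module and message are placeholders:

    # Hypothetical illustration of the gate reported in the skip result above.
    # The two variables are role defaults named in the log; the debug body is
    # only a placeholder, not the role's real task.
    - name: Example task gated on wireless or team interfaces
      ansible.builtin.debug:
        msg: "Runs only when a wireless or team connection is requested"
      when: __network_wireless_connections_defined or __network_team_connections_defined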
32935 1726853728.22945: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32935 1726853728.22949: in run() - task 02083763-bbaf-84df-441d-00000000001c 32935 1726853728.22953: variable 'ansible_search_path' from source: unknown 32935 1726853728.22954: variable 'ansible_search_path' from source: unknown 32935 1726853728.22996: calling self._execute() 32935 1726853728.23093: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.23106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.23121: variable 'omit' from source: magic vars 32935 1726853728.23530: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.23546: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.23733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853728.26543: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853728.26610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853728.26638: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853728.26681: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853728.26704: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853728.26766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.26790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.26808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.26834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.26844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.26923: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.26935: Evaluated conditional (ansible_distribution_major_version | int < 8): False 32935 1726853728.26939: when evaluation is False, skipping this task 32935 1726853728.26942: _execute() done 32935 1726853728.26945: dumping result to json 32935 1726853728.26947: done dumping result, returning 32935 1726853728.26955: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-84df-441d-00000000001c] 32935 
1726853728.26960: sending task result for task 02083763-bbaf-84df-441d-00000000001c 32935 1726853728.27048: done sending task result for task 02083763-bbaf-84df-441d-00000000001c 32935 1726853728.27051: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 32935 1726853728.27106: no more pending results, returning what we have 32935 1726853728.27109: results queue empty 32935 1726853728.27110: checking for any_errors_fatal 32935 1726853728.27115: done checking for any_errors_fatal 32935 1726853728.27116: checking for max_fail_percentage 32935 1726853728.27118: done checking for max_fail_percentage 32935 1726853728.27119: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.27120: done checking to see if all hosts have failed 32935 1726853728.27121: getting the remaining hosts for this loop 32935 1726853728.27122: done getting the remaining hosts for this loop 32935 1726853728.27125: getting the next task for host managed_node1 32935 1726853728.27132: done getting next task for host managed_node1 32935 1726853728.27135: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32935 1726853728.27138: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853728.27151: getting variables 32935 1726853728.27153: in VariableManager get_vars() 32935 1726853728.27197: Calling all_inventory to load vars for managed_node1 32935 1726853728.27200: Calling groups_inventory to load vars for managed_node1 32935 1726853728.27202: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.27212: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.27215: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.27217: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.28092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.28944: done with get_vars() 32935 1726853728.28965: done getting variables 32935 1726853728.29027: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:28 -0400 (0:00:00.070) 0:00:13.426 ****** 32935 1726853728.29054: entering _queue_task() for managed_node1/fail 32935 1726853728.29383: worker is 1 (out of 1 available) 32935 1726853728.29398: exiting _queue_task() for managed_node1/fail 32935 1726853728.29411: done queuing things up, now waiting for results queue to drain 32935 1726853728.29412: waiting for pending results... 
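The YUM-flavoured check was skipped for a different reason than its DNF counterpart: its condition requires ansible_distribution_major_version | int < 8, which is False on this host. Note also that ansible-core transparently redirects ansible.builtin.yum to the dnf action here, as the "redirecting (type: action)" line shows. A hypothetical sketch of such a version gate; the task body is an assumption, not the role's real implementation:

    # Hypothetical version gate matching the false_condition reported above.
    # The body is assumed; modern ansible-core redirects ansible.builtin.yum
    # to the dnf action in any case.
    - name: Example YUM-only step for older (EL < 8) distributions
      ansible.builtin.yum:
        list: updates
      when: ansible_distribution_major_version | int < 8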
32935 1726853728.29792: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32935 1726853728.29840: in run() - task 02083763-bbaf-84df-441d-00000000001d 32935 1726853728.29859: variable 'ansible_search_path' from source: unknown 32935 1726853728.29867: variable 'ansible_search_path' from source: unknown 32935 1726853728.29912: calling self._execute() 32935 1726853728.30004: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.30016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.30033: variable 'omit' from source: magic vars 32935 1726853728.30405: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.30415: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.30497: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853728.30625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853728.32377: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853728.32382: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853728.32393: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853728.32430: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853728.32461: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853728.32541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.32585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.32615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.32659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.32683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.32732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.32765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.32796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.32838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.32857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.32903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.32935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.32956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.33021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.33024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.33133: variable 'network_connections' from source: task vars 32935 1726853728.33144: variable 'interface' from source: play vars 32935 1726853728.33221: variable 'interface' from source: play vars 32935 1726853728.33231: variable 'vlan_interface' from source: play vars 32935 1726853728.33285: variable 'vlan_interface' from source: play vars 32935 1726853728.33289: variable 'interface' from source: play vars 32935 1726853728.33343: variable 'interface' from source: play vars 32935 1726853728.33575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853728.33579: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853728.33610: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853728.33656: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853728.33692: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853728.33736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853728.33763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853728.33798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.33829: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853728.33897: variable '__network_team_connections_defined' from source: role '' defaults 32935 1726853728.34132: variable 'network_connections' from source: task vars 32935 1726853728.34143: variable 'interface' from source: play vars 32935 1726853728.34206: variable 'interface' from source: play vars 32935 1726853728.34222: variable 'vlan_interface' from source: play vars 32935 1726853728.34283: variable 'vlan_interface' from source: play vars 32935 1726853728.34295: variable 'interface' from source: play vars 32935 1726853728.34357: variable 'interface' from source: play vars 32935 1726853728.34398: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32935 1726853728.34414: when evaluation is False, skipping this task 32935 1726853728.34421: _execute() done 32935 1726853728.34429: dumping result to json 32935 1726853728.34437: done dumping result, returning 32935 1726853728.34449: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-84df-441d-00000000001d] 32935 1726853728.34459: sending task result for task 02083763-bbaf-84df-441d-00000000001d skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32935 1726853728.34621: no more pending results, returning what we have 32935 1726853728.34624: results queue empty 32935 1726853728.34625: checking for any_errors_fatal 32935 1726853728.34629: done checking for any_errors_fatal 32935 1726853728.34630: checking for max_fail_percentage 32935 1726853728.34632: done checking for max_fail_percentage 32935 1726853728.34632: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.34634: done checking to see if all hosts have failed 32935 1726853728.34635: getting the remaining hosts for this loop 32935 1726853728.34636: done getting the remaining hosts for this loop 32935 1726853728.34640: getting the next task for host managed_node1 32935 1726853728.34647: done getting next task for host managed_node1 32935 1726853728.34650: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 32935 1726853728.34652: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853728.34668: getting variables 32935 1726853728.34669: in VariableManager get_vars() 32935 1726853728.34814: Calling all_inventory to load vars for managed_node1 32935 1726853728.34817: Calling groups_inventory to load vars for managed_node1 32935 1726853728.34819: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.34828: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.34830: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.34834: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.35351: done sending task result for task 02083763-bbaf-84df-441d-00000000001d 32935 1726853728.35355: WORKER PROCESS EXITING 32935 1726853728.38370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.42367: done with get_vars() 32935 1726853728.42398: done getting variables 32935 1726853728.42461: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:28 -0400 (0:00:00.135) 0:00:13.561 ****** 32935 1726853728.42605: entering _queue_task() for managed_node1/package 32935 1726853728.43324: worker is 1 (out of 1 available) 32935 1726853728.43382: exiting _queue_task() for managed_node1/package 32935 1726853728.43396: done queuing things up, now waiting for results queue to drain 32935 1726853728.43445: waiting for pending results... 
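The consent prompt that was just skipped is implemented with the fail action (the log loads the 'fail' action plugin for it) and sits behind the same wireless/team gate. A hypothetical shape for such a guard; the message text is invented for illustration, and the real task at tasks/main.yml:60 may consult additional override variables:

    # Hypothetical consent guard. The message is a placeholder; the real task
    # may check extra variables before failing.
    - name: Example consent guard before restarting NetworkManager
      ansible.builtin.fail:
        msg: >-
          Wireless or team connections require restarting NetworkManager;
          re-run with explicit consent to proceed.
      when: __network_wireless_connections_defined or __network_team_connections_defined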
32935 1726853728.43887: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 32935 1726853728.44077: in run() - task 02083763-bbaf-84df-441d-00000000001e 32935 1726853728.44082: variable 'ansible_search_path' from source: unknown 32935 1726853728.44085: variable 'ansible_search_path' from source: unknown 32935 1726853728.44121: calling self._execute() 32935 1726853728.44214: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.44386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.44477: variable 'omit' from source: magic vars 32935 1726853728.45261: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.45576: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.45689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853728.46144: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853728.46422: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853728.46461: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853728.46500: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853728.46787: variable 'network_packages' from source: role '' defaults 32935 1726853728.47176: variable '__network_provider_setup' from source: role '' defaults 32935 1726853728.47180: variable '__network_service_name_default_nm' from source: role '' defaults 32935 1726853728.47186: variable '__network_service_name_default_nm' from source: role '' defaults 32935 1726853728.47200: variable '__network_packages_default_nm' from source: role '' defaults 32935 1726853728.47264: variable '__network_packages_default_nm' from source: role '' defaults 32935 1726853728.47660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853728.60333: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853728.60406: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853728.60454: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853728.60495: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853728.60529: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853728.60609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.60643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.60682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.60728: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.60747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.60805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.60833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.60862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.60911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.60930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.61178: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32935 1726853728.61299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.61334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.61363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.61407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.61427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.61521: variable 'ansible_python' from source: facts 32935 1726853728.61586: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32935 1726853728.61729: variable '__network_wpa_supplicant_required' from source: role '' defaults 32935 1726853728.61851: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32935 1726853728.61975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.62009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 32935 1726853728.62040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.62086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.62108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.62154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.62305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.62308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.62311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.62313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.62430: variable 'network_connections' from source: task vars 32935 1726853728.62441: variable 'interface' from source: play vars 32935 1726853728.62536: variable 'interface' from source: play vars 32935 1726853728.62551: variable 'vlan_interface' from source: play vars 32935 1726853728.62646: variable 'vlan_interface' from source: play vars 32935 1726853728.62658: variable 'interface' from source: play vars 32935 1726853728.62765: variable 'interface' from source: play vars 32935 1726853728.62840: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853728.62880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853728.62912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.62940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853728.62985: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853728.63256: variable 'network_connections' from source: task vars 32935 1726853728.63266: variable 'interface' from source: play vars 32935 1726853728.63367: variable 'interface' from source: play vars 32935 1726853728.63386: variable 
'vlan_interface' from source: play vars 32935 1726853728.63483: variable 'vlan_interface' from source: play vars 32935 1726853728.63495: variable 'interface' from source: play vars 32935 1726853728.63783: variable 'interface' from source: play vars 32935 1726853728.63788: variable '__network_packages_default_wireless' from source: role '' defaults 32935 1726853728.63790: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853728.64052: variable 'network_connections' from source: task vars 32935 1726853728.64063: variable 'interface' from source: play vars 32935 1726853728.64158: variable 'interface' from source: play vars 32935 1726853728.64315: variable 'vlan_interface' from source: play vars 32935 1726853728.64390: variable 'vlan_interface' from source: play vars 32935 1726853728.64402: variable 'interface' from source: play vars 32935 1726853728.64469: variable 'interface' from source: play vars 32935 1726853728.64516: variable '__network_packages_default_team' from source: role '' defaults 32935 1726853728.64601: variable '__network_team_connections_defined' from source: role '' defaults 32935 1726853728.64902: variable 'network_connections' from source: task vars 32935 1726853728.64912: variable 'interface' from source: play vars 32935 1726853728.64980: variable 'interface' from source: play vars 32935 1726853728.64995: variable 'vlan_interface' from source: play vars 32935 1726853728.65058: variable 'vlan_interface' from source: play vars 32935 1726853728.65070: variable 'interface' from source: play vars 32935 1726853728.65139: variable 'interface' from source: play vars 32935 1726853728.65208: variable '__network_service_name_default_initscripts' from source: role '' defaults 32935 1726853728.65269: variable '__network_service_name_default_initscripts' from source: role '' defaults 32935 1726853728.65294: variable '__network_packages_default_initscripts' from source: role '' defaults 32935 1726853728.65377: variable '__network_packages_default_initscripts' from source: role '' defaults 32935 1726853728.65645: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32935 1726853728.66161: variable 'network_connections' from source: task vars 32935 1726853728.66176: variable 'interface' from source: play vars 32935 1726853728.66238: variable 'interface' from source: play vars 32935 1726853728.66273: variable 'vlan_interface' from source: play vars 32935 1726853728.66319: variable 'vlan_interface' from source: play vars 32935 1726853728.66381: variable 'interface' from source: play vars 32935 1726853728.66397: variable 'interface' from source: play vars 32935 1726853728.66411: variable 'ansible_distribution' from source: facts 32935 1726853728.66418: variable '__network_rh_distros' from source: role '' defaults 32935 1726853728.66426: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.66452: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32935 1726853728.66639: variable 'ansible_distribution' from source: facts 32935 1726853728.66648: variable '__network_rh_distros' from source: role '' defaults 32935 1726853728.66657: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.66674: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32935 1726853728.66841: variable 'ansible_distribution' from source: facts 32935 1726853728.66921: variable '__network_rh_distros' from source: 
role '' defaults 32935 1726853728.66924: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.66927: variable 'network_provider' from source: set_fact 32935 1726853728.66930: variable 'ansible_facts' from source: unknown 32935 1726853728.67549: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 32935 1726853728.67558: when evaluation is False, skipping this task 32935 1726853728.67570: _execute() done 32935 1726853728.67585: dumping result to json 32935 1726853728.67593: done dumping result, returning 32935 1726853728.67605: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-84df-441d-00000000001e] 32935 1726853728.67613: sending task result for task 02083763-bbaf-84df-441d-00000000001e skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 32935 1726853728.67850: no more pending results, returning what we have 32935 1726853728.67853: results queue empty 32935 1726853728.67853: checking for any_errors_fatal 32935 1726853728.67862: done checking for any_errors_fatal 32935 1726853728.67863: checking for max_fail_percentage 32935 1726853728.67864: done checking for max_fail_percentage 32935 1726853728.67865: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.67866: done checking to see if all hosts have failed 32935 1726853728.67867: getting the remaining hosts for this loop 32935 1726853728.67868: done getting the remaining hosts for this loop 32935 1726853728.67874: getting the next task for host managed_node1 32935 1726853728.67880: done getting next task for host managed_node1 32935 1726853728.67883: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32935 1726853728.67897: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853728.67892: done sending task result for task 02083763-bbaf-84df-441d-00000000001e 32935 1726853728.67912: getting variables 32935 1726853728.67931: in VariableManager get_vars() 32935 1726853728.67928: WORKER PROCESS EXITING 32935 1726853728.67975: Calling all_inventory to load vars for managed_node1 32935 1726853728.68110: Calling groups_inventory to load vars for managed_node1 32935 1726853728.68114: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.68123: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.68125: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.68128: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.73854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.75359: done with get_vars() 32935 1726853728.75387: done getting variables 32935 1726853728.75440: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:28 -0400 (0:00:00.328) 0:00:13.890 ****** 32935 1726853728.75477: entering _queue_task() for managed_node1/package 32935 1726853728.75827: worker is 1 (out of 1 available) 32935 1726853728.75842: exiting _queue_task() for managed_node1/package 32935 1726853728.75855: done queuing things up, now waiting for results queue to drain 32935 1726853728.75857: waiting for pending results... 
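The "Install packages" skip above shows an idempotence optimisation rather than a feature gate: the condition not network_packages is subset(ansible_facts.packages.keys()) compares the role's computed package list against the previously gathered package facts, so the package module only runs when something is actually missing. A minimal sketch of that pattern; it assumes ansible_facts.packages has already been populated (for example by ansible.builtin.package_facts), and the variable name comes from the log:

    # Hypothetical sketch of the subset-based install gate. Requires that
    # ansible_facts.packages is already populated (e.g. via package_facts).
    - name: Example install step that only runs when packages are missing
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())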
32935 1726853728.76151: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32935 1726853728.76321: in run() - task 02083763-bbaf-84df-441d-00000000001f 32935 1726853728.76325: variable 'ansible_search_path' from source: unknown 32935 1726853728.76328: variable 'ansible_search_path' from source: unknown 32935 1726853728.76429: calling self._execute() 32935 1726853728.76467: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.76481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.76499: variable 'omit' from source: magic vars 32935 1726853728.76898: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.76917: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.77040: variable 'network_state' from source: role '' defaults 32935 1726853728.77054: Evaluated conditional (network_state != {}): False 32935 1726853728.77078: when evaluation is False, skipping this task 32935 1726853728.77084: _execute() done 32935 1726853728.77086: dumping result to json 32935 1726853728.77088: done dumping result, returning 32935 1726853728.77177: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-84df-441d-00000000001f] 32935 1726853728.77182: sending task result for task 02083763-bbaf-84df-441d-00000000001f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853728.77293: no more pending results, returning what we have 32935 1726853728.77300: results queue empty 32935 1726853728.77301: checking for any_errors_fatal 32935 1726853728.77310: done checking for any_errors_fatal 32935 1726853728.77310: checking for max_fail_percentage 32935 1726853728.77312: done checking for max_fail_percentage 32935 1726853728.77313: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.77314: done checking to see if all hosts have failed 32935 1726853728.77315: getting the remaining hosts for this loop 32935 1726853728.77316: done getting the remaining hosts for this loop 32935 1726853728.77320: getting the next task for host managed_node1 32935 1726853728.77327: done getting next task for host managed_node1 32935 1726853728.77330: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32935 1726853728.77332: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853728.77347: getting variables 32935 1726853728.77349: in VariableManager get_vars() 32935 1726853728.77391: Calling all_inventory to load vars for managed_node1 32935 1726853728.77394: Calling groups_inventory to load vars for managed_node1 32935 1726853728.77396: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.77586: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.77590: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.77593: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.78201: done sending task result for task 02083763-bbaf-84df-441d-00000000001f 32935 1726853728.78205: WORKER PROCESS EXITING 32935 1726853728.79050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.80659: done with get_vars() 32935 1726853728.80685: done getting variables 32935 1726853728.80749: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:28 -0400 (0:00:00.053) 0:00:13.943 ****** 32935 1726853728.80782: entering _queue_task() for managed_node1/package 32935 1726853728.81212: worker is 1 (out of 1 available) 32935 1726853728.81222: exiting _queue_task() for managed_node1/package 32935 1726853728.81233: done queuing things up, now waiting for results queue to drain 32935 1726853728.81235: waiting for pending results... 
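Both nmstate-related install tasks are gated on network_state != {}; this play drives the role through network_connections instead, so the first of the pair was skipped above and the python3-libnmstate task just announced will be skipped the same way. A hypothetical sketch of the first one; the package names are taken from the task title and everything else is assumed:

    # Hypothetical sketch; the real task at tasks/main.yml:85 may differ.
    - name: Example install of NetworkManager and nmstate for network_state users
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}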
32935 1726853728.81426: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32935 1726853728.81583: in run() - task 02083763-bbaf-84df-441d-000000000020 32935 1726853728.81602: variable 'ansible_search_path' from source: unknown 32935 1726853728.81609: variable 'ansible_search_path' from source: unknown 32935 1726853728.81654: calling self._execute() 32935 1726853728.81751: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.81767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.81787: variable 'omit' from source: magic vars 32935 1726853728.82167: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.82189: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.82327: variable 'network_state' from source: role '' defaults 32935 1726853728.82342: Evaluated conditional (network_state != {}): False 32935 1726853728.82376: when evaluation is False, skipping this task 32935 1726853728.82379: _execute() done 32935 1726853728.82382: dumping result to json 32935 1726853728.82384: done dumping result, returning 32935 1726853728.82387: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-84df-441d-000000000020] 32935 1726853728.82389: sending task result for task 02083763-bbaf-84df-441d-000000000020 32935 1726853728.82609: done sending task result for task 02083763-bbaf-84df-441d-000000000020 32935 1726853728.82612: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853728.82662: no more pending results, returning what we have 32935 1726853728.82665: results queue empty 32935 1726853728.82666: checking for any_errors_fatal 32935 1726853728.82676: done checking for any_errors_fatal 32935 1726853728.82677: checking for max_fail_percentage 32935 1726853728.82679: done checking for max_fail_percentage 32935 1726853728.82680: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.82681: done checking to see if all hosts have failed 32935 1726853728.82682: getting the remaining hosts for this loop 32935 1726853728.82683: done getting the remaining hosts for this loop 32935 1726853728.82687: getting the next task for host managed_node1 32935 1726853728.82695: done getting next task for host managed_node1 32935 1726853728.82699: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32935 1726853728.82702: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853728.82718: getting variables 32935 1726853728.82720: in VariableManager get_vars() 32935 1726853728.82764: Calling all_inventory to load vars for managed_node1 32935 1726853728.82767: Calling groups_inventory to load vars for managed_node1 32935 1726853728.82769: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.82891: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.82894: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.82898: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.84320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.85932: done with get_vars() 32935 1726853728.85959: done getting variables 32935 1726853728.86064: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:28 -0400 (0:00:00.053) 0:00:13.996 ****** 32935 1726853728.86100: entering _queue_task() for managed_node1/service 32935 1726853728.86102: Creating lock for service 32935 1726853728.86683: worker is 1 (out of 1 available) 32935 1726853728.86691: exiting _queue_task() for managed_node1/service 32935 1726853728.86701: done queuing things up, now waiting for results queue to drain 32935 1726853728.86703: waiting for pending results... 
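The restart task queued above is the first in this stretch to use the service action (note the "Creating lock for service" line and found_in_cache=False on the plugin load), and it sits behind the same wireless/team gate, so it will also be skipped below. A hypothetical sketch; the service name is assumed from the task title, and the real role likely resolves it from a variable such as __network_service_name_default_nm seen earlier in the log:

    # Hypothetical restart step; service name assumed from the task title.
    - name: Example NetworkManager restart for wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined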
32935 1726853728.86830: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32935 1726853728.86909: in run() - task 02083763-bbaf-84df-441d-000000000021 32935 1726853728.86933: variable 'ansible_search_path' from source: unknown 32935 1726853728.86942: variable 'ansible_search_path' from source: unknown 32935 1726853728.86984: calling self._execute() 32935 1726853728.87084: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.87146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.87150: variable 'omit' from source: magic vars 32935 1726853728.87498: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.87516: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.87637: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853728.87835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853728.89674: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853728.89717: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853728.89756: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853728.89782: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853728.89802: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853728.89881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.89897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.89953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.90012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.90015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.90018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.90035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.90081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 32935 1726853728.90095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.90207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.90211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.90213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.90216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.90242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.90245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.90484: variable 'network_connections' from source: task vars 32935 1726853728.90487: variable 'interface' from source: play vars 32935 1726853728.90490: variable 'interface' from source: play vars 32935 1726853728.90507: variable 'vlan_interface' from source: play vars 32935 1726853728.90561: variable 'vlan_interface' from source: play vars 32935 1726853728.90565: variable 'interface' from source: play vars 32935 1726853728.90621: variable 'interface' from source: play vars 32935 1726853728.90693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853728.90925: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853728.90928: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853728.90931: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853728.90946: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853728.90988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853728.91009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853728.91044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.91075: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853728.91125: variable '__network_team_connections_defined' from source: role '' defaults 32935 1726853728.91296: variable 'network_connections' from source: task vars 32935 1726853728.91299: variable 'interface' from source: play vars 32935 1726853728.91341: variable 'interface' from source: play vars 32935 1726853728.91355: variable 'vlan_interface' from source: play vars 32935 1726853728.91398: variable 'vlan_interface' from source: play vars 32935 1726853728.91403: variable 'interface' from source: play vars 32935 1726853728.91444: variable 'interface' from source: play vars 32935 1726853728.91474: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32935 1726853728.91489: when evaluation is False, skipping this task 32935 1726853728.91492: _execute() done 32935 1726853728.91495: dumping result to json 32935 1726853728.91497: done dumping result, returning 32935 1726853728.91499: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-84df-441d-000000000021] 32935 1726853728.91501: sending task result for task 02083763-bbaf-84df-441d-000000000021 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32935 1726853728.91632: no more pending results, returning what we have 32935 1726853728.91635: results queue empty 32935 1726853728.91636: checking for any_errors_fatal 32935 1726853728.91643: done checking for any_errors_fatal 32935 1726853728.91644: checking for max_fail_percentage 32935 1726853728.91646: done checking for max_fail_percentage 32935 1726853728.91646: checking to see if all hosts have failed and the running result is not ok 32935 1726853728.91647: done checking to see if all hosts have failed 32935 1726853728.91648: getting the remaining hosts for this loop 32935 1726853728.91649: done getting the remaining hosts for this loop 32935 1726853728.91653: getting the next task for host managed_node1 32935 1726853728.91662: done getting next task for host managed_node1 32935 1726853728.91666: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32935 1726853728.91668: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853728.91685: getting variables 32935 1726853728.91686: in VariableManager get_vars() 32935 1726853728.91728: Calling all_inventory to load vars for managed_node1 32935 1726853728.91731: Calling groups_inventory to load vars for managed_node1 32935 1726853728.91733: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853728.91743: Calling all_plugins_play to load vars for managed_node1 32935 1726853728.91745: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853728.91747: Calling groups_plugins_play to load vars for managed_node1 32935 1726853728.92284: done sending task result for task 02083763-bbaf-84df-441d-000000000021 32935 1726853728.92288: WORKER PROCESS EXITING 32935 1726853728.92733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853728.93828: done with get_vars() 32935 1726853728.93855: done getting variables 32935 1726853728.93916: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:28 -0400 (0:00:00.078) 0:00:14.075 ****** 32935 1726853728.93947: entering _queue_task() for managed_node1/service 32935 1726853728.94278: worker is 1 (out of 1 available) 32935 1726853728.94291: exiting _queue_task() for managed_node1/service 32935 1726853728.94303: done queuing things up, now waiting for results queue to drain 32935 1726853728.94305: waiting for pending results... 
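The task queued above ("fedora.linux_system_roles.network : Enable and start NetworkManager", tasks/main.yml:122) goes through the service action plugin, and the executor trace that follows shows it being dispatched to the target as ansible.legacy.systemd with module_args name=NetworkManager, state=started, enabled=true once the conditional network_provider == "nm" or network_state != {} evaluates to True. For reference, a minimal stand-alone equivalent of that invocation is sketched here; it is inferred from the logged module_args, is not the role's actual task file, and simply targets the inventory host managed_node1 used throughout this run.

    # Sketch only: reproduces the systemd invocation recorded in this log,
    # outside the fedora.linux_system_roles.network role.
    - hosts: managed_node1
      gather_facts: false
      tasks:
        - name: Enable and start NetworkManager
          ansible.builtin.systemd:   # dispatched in this log as ansible.legacy.systemd
            name: NetworkManager     # taken from the logged module_args
            state: started
            enabled: true

ansible.legacy.systemd resolves to the same built-in systemd module, so the sketch keeps the stable ansible.builtin name; the remaining module_args that appear later in the result (daemon_reload, daemon_reexec, scope, no_block, force, masked) are the module's defaults and are omitted here.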
32935 1726853728.94620: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32935 1726853728.94714: in run() - task 02083763-bbaf-84df-441d-000000000022 32935 1726853728.94718: variable 'ansible_search_path' from source: unknown 32935 1726853728.94720: variable 'ansible_search_path' from source: unknown 32935 1726853728.94753: calling self._execute() 32935 1726853728.94862: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853728.94866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853728.94869: variable 'omit' from source: magic vars 32935 1726853728.95174: variable 'ansible_distribution_major_version' from source: facts 32935 1726853728.95184: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853728.95297: variable 'network_provider' from source: set_fact 32935 1726853728.95301: variable 'network_state' from source: role '' defaults 32935 1726853728.95310: Evaluated conditional (network_provider == "nm" or network_state != {}): True 32935 1726853728.95316: variable 'omit' from source: magic vars 32935 1726853728.95351: variable 'omit' from source: magic vars 32935 1726853728.95379: variable 'network_service_name' from source: role '' defaults 32935 1726853728.95430: variable 'network_service_name' from source: role '' defaults 32935 1726853728.95504: variable '__network_provider_setup' from source: role '' defaults 32935 1726853728.95508: variable '__network_service_name_default_nm' from source: role '' defaults 32935 1726853728.95551: variable '__network_service_name_default_nm' from source: role '' defaults 32935 1726853728.95563: variable '__network_packages_default_nm' from source: role '' defaults 32935 1726853728.95606: variable '__network_packages_default_nm' from source: role '' defaults 32935 1726853728.95748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853728.97477: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853728.97481: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853728.97533: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853728.97608: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853728.97612: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853728.97652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.97699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.97702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.97743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 32935 1726853728.97761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.97812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.97826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.97848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.97878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.97890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.98064: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32935 1726853728.98177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.98198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.98265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.98269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.98274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.98349: variable 'ansible_python' from source: facts 32935 1726853728.98376: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32935 1726853728.98493: variable '__network_wpa_supplicant_required' from source: role '' defaults 32935 1726853728.98523: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32935 1726853728.98640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.98664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.98687: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.98735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.98738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.98777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853728.98816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853728.98843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853728.98854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853728.98868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853728.99033: variable 'network_connections' from source: task vars 32935 1726853728.99039: variable 'interface' from source: play vars 32935 1726853728.99068: variable 'interface' from source: play vars 32935 1726853728.99126: variable 'vlan_interface' from source: play vars 32935 1726853728.99155: variable 'vlan_interface' from source: play vars 32935 1726853728.99162: variable 'interface' from source: play vars 32935 1726853728.99300: variable 'interface' from source: play vars 32935 1726853728.99332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853728.99495: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853728.99532: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853728.99564: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853728.99599: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853728.99644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853728.99668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853728.99692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 
1726853728.99714: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853728.99752: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853728.99934: variable 'network_connections' from source: task vars 32935 1726853728.99938: variable 'interface' from source: play vars 32935 1726853728.99994: variable 'interface' from source: play vars 32935 1726853729.00004: variable 'vlan_interface' from source: play vars 32935 1726853729.00054: variable 'vlan_interface' from source: play vars 32935 1726853729.00063: variable 'interface' from source: play vars 32935 1726853729.00116: variable 'interface' from source: play vars 32935 1726853729.00152: variable '__network_packages_default_wireless' from source: role '' defaults 32935 1726853729.00210: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853729.00392: variable 'network_connections' from source: task vars 32935 1726853729.00397: variable 'interface' from source: play vars 32935 1726853729.00447: variable 'interface' from source: play vars 32935 1726853729.00454: variable 'vlan_interface' from source: play vars 32935 1726853729.00505: variable 'vlan_interface' from source: play vars 32935 1726853729.00510: variable 'interface' from source: play vars 32935 1726853729.00561: variable 'interface' from source: play vars 32935 1726853729.00580: variable '__network_packages_default_team' from source: role '' defaults 32935 1726853729.00636: variable '__network_team_connections_defined' from source: role '' defaults 32935 1726853729.00818: variable 'network_connections' from source: task vars 32935 1726853729.00821: variable 'interface' from source: play vars 32935 1726853729.00873: variable 'interface' from source: play vars 32935 1726853729.00881: variable 'vlan_interface' from source: play vars 32935 1726853729.00928: variable 'vlan_interface' from source: play vars 32935 1726853729.00933: variable 'interface' from source: play vars 32935 1726853729.00985: variable 'interface' from source: play vars 32935 1726853729.01031: variable '__network_service_name_default_initscripts' from source: role '' defaults 32935 1726853729.01074: variable '__network_service_name_default_initscripts' from source: role '' defaults 32935 1726853729.01080: variable '__network_packages_default_initscripts' from source: role '' defaults 32935 1726853729.01122: variable '__network_packages_default_initscripts' from source: role '' defaults 32935 1726853729.01256: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32935 1726853729.01745: variable 'network_connections' from source: task vars 32935 1726853729.01748: variable 'interface' from source: play vars 32935 1726853729.01798: variable 'interface' from source: play vars 32935 1726853729.01807: variable 'vlan_interface' from source: play vars 32935 1726853729.01863: variable 'vlan_interface' from source: play vars 32935 1726853729.01866: variable 'interface' from source: play vars 32935 1726853729.01924: variable 'interface' from source: play vars 32935 1726853729.01933: variable 'ansible_distribution' from source: facts 32935 1726853729.01936: variable '__network_rh_distros' from source: role '' defaults 32935 1726853729.01943: variable 'ansible_distribution_major_version' from source: facts 32935 1726853729.01966: variable 
'__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32935 1726853729.02127: variable 'ansible_distribution' from source: facts 32935 1726853729.02130: variable '__network_rh_distros' from source: role '' defaults 32935 1726853729.02183: variable 'ansible_distribution_major_version' from source: facts 32935 1726853729.02187: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32935 1726853729.02306: variable 'ansible_distribution' from source: facts 32935 1726853729.02309: variable '__network_rh_distros' from source: role '' defaults 32935 1726853729.02418: variable 'ansible_distribution_major_version' from source: facts 32935 1726853729.02421: variable 'network_provider' from source: set_fact 32935 1726853729.02423: variable 'omit' from source: magic vars 32935 1726853729.02426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853729.02428: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853729.02455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853729.02468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853729.02479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853729.02511: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853729.02515: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853729.02517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853729.02610: Set connection var ansible_timeout to 10 32935 1726853729.02615: Set connection var ansible_shell_type to sh 32935 1726853729.02623: Set connection var ansible_pipelining to False 32935 1726853729.02625: Set connection var ansible_connection to ssh 32935 1726853729.02630: Set connection var ansible_shell_executable to /bin/sh 32935 1726853729.02636: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853729.02662: variable 'ansible_shell_executable' from source: unknown 32935 1726853729.02665: variable 'ansible_connection' from source: unknown 32935 1726853729.02677: variable 'ansible_module_compression' from source: unknown 32935 1726853729.02680: variable 'ansible_shell_type' from source: unknown 32935 1726853729.02682: variable 'ansible_shell_executable' from source: unknown 32935 1726853729.02684: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853729.02686: variable 'ansible_pipelining' from source: unknown 32935 1726853729.02688: variable 'ansible_timeout' from source: unknown 32935 1726853729.02689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853729.02749: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853729.02757: variable 'omit' from source: magic vars 32935 1726853729.02763: starting attempt loop 32935 1726853729.02765: running the handler 32935 1726853729.02824: variable 'ansible_facts' from source: unknown 32935 
1726853729.03301: _low_level_execute_command(): starting 32935 1726853729.03307: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853729.03827: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853729.03831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853729.03834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853729.03837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853729.03886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853729.03889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853729.03899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853729.03960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853729.05641: stdout chunk (state=3): >>>/root <<< 32935 1726853729.05749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853729.05782: stderr chunk (state=3): >>><<< 32935 1726853729.05785: stdout chunk (state=3): >>><<< 32935 1726853729.05809: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853729.05825: _low_level_execute_command(): starting 32935 1726853729.05829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109 `" && echo 
ansible-tmp-1726853729.0580359-33634-207305666215109="` echo /root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109 `" ) && sleep 0' 32935 1726853729.06496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853729.06538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853729.06565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853729.06593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853729.06676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853729.08531: stdout chunk (state=3): >>>ansible-tmp-1726853729.0580359-33634-207305666215109=/root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109 <<< 32935 1726853729.08631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853729.08660: stderr chunk (state=3): >>><<< 32935 1726853729.08663: stdout chunk (state=3): >>><<< 32935 1726853729.08678: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853729.0580359-33634-207305666215109=/root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853729.08710: variable 'ansible_module_compression' from source: unknown 32935 1726853729.08751: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 32935 1726853729.08755: ANSIBALLZ: Acquiring lock 32935 
1726853729.08760: ANSIBALLZ: Lock acquired: 140683294872048 32935 1726853729.08763: ANSIBALLZ: Creating module 32935 1726853729.31199: ANSIBALLZ: Writing module into payload 32935 1726853729.31429: ANSIBALLZ: Writing module 32935 1726853729.31433: ANSIBALLZ: Renaming module 32935 1726853729.31435: ANSIBALLZ: Done creating module 32935 1726853729.31437: variable 'ansible_facts' from source: unknown 32935 1726853729.31630: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/AnsiballZ_systemd.py 32935 1726853729.31886: Sending initial data 32935 1726853729.31889: Sent initial data (156 bytes) 32935 1726853729.32419: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853729.32422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853729.32425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853729.32431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853729.32485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853729.32488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853729.32490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853729.32545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853729.34407: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853729.34435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853729.34483: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpljy63aan /root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/AnsiballZ_systemd.py <<< 32935 1726853729.34493: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/AnsiballZ_systemd.py" <<< 32935 1726853729.34523: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpljy63aan" to remote "/root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/AnsiballZ_systemd.py" <<< 32935 1726853729.36077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853729.36080: stdout chunk (state=3): >>><<< 32935 1726853729.36083: stderr chunk (state=3): >>><<< 32935 1726853729.36085: done transferring module to remote 32935 1726853729.36087: _low_level_execute_command(): starting 32935 1726853729.36089: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/ /root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/AnsiballZ_systemd.py && sleep 0' 32935 1726853729.36654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853729.36677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853729.36710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853729.36713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853729.36715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853729.36718: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853729.36728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853729.36817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853729.36835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853729.36894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853729.38643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853729.38705: stderr chunk (state=3): >>><<< 32935 1726853729.38708: stdout chunk (state=3): >>><<< 32935 1726853729.38724: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853729.38728: _low_level_execute_command(): starting 32935 1726853729.38733: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/AnsiballZ_systemd.py && sleep 0' 32935 1726853729.39657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853729.39666: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853729.39679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853729.39697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853729.39712: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853729.39719: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853729.39728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853729.39742: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853729.39749: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853729.39756: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32935 1726853729.39812: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853729.39850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853729.39890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853729.39955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853729.69103: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 
30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10727424", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3325849600", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1944407000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", 
"ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "system.slice basic.target systemd-journald.socket cloud-init-local.service sysinit.target dbus.socket network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:33:02 EDT", "StateChangeTimestampMonotonic": "748756263", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 32935 1726853729.70742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853729.70780: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853729.71086: stderr chunk (state=3): >>><<< 32935 1726853729.71090: stdout chunk (state=3): >>><<< 32935 1726853729.71093: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10727424", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3325849600", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1944407000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "system.slice basic.target systemd-journald.socket cloud-init-local.service sysinit.target dbus.socket network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:33:02 EDT", "StateChangeTimestampMonotonic": "748756263", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853729.71276: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853729.71280: _low_level_execute_command(): starting 32935 1726853729.71282: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853729.0580359-33634-207305666215109/ > /dev/null 2>&1 && sleep 0' 32935 1726853729.71858: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853729.71879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853729.71963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853729.71986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853729.71999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853729.72273: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853729.74126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853729.74134: stdout chunk (state=3): >>><<< 32935 1726853729.74141: stderr chunk (state=3): >>><<< 32935 1726853729.74187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853729.74191: handler run complete 32935 1726853729.74501: attempt loop complete, returning result 32935 1726853729.74504: _execute() done 32935 1726853729.74511: dumping result to json 32935 1726853729.74524: done dumping result, returning 32935 1726853729.74533: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-84df-441d-000000000022] 32935 1726853729.74536: sending task result for task 02083763-bbaf-84df-441d-000000000022 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 1726853729.75364: no more pending results, returning what we have 32935 1726853729.75368: results queue empty 32935 1726853729.75369: checking for any_errors_fatal 32935 1726853729.75378: done checking for any_errors_fatal 32935 1726853729.75379: checking for max_fail_percentage 32935 1726853729.75381: done checking for max_fail_percentage 32935 1726853729.75381: checking to see if all hosts have failed and the running result is not ok 32935 1726853729.75383: done checking to see if all hosts have failed 32935 1726853729.75383: getting the remaining hosts for this loop 32935 1726853729.75385: done getting the remaining hosts for this loop 32935 1726853729.75388: getting the next task for host managed_node1 32935 1726853729.75396: done getting next task for host managed_node1 32935 1726853729.75400: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32935 1726853729.75403: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853729.75414: getting variables 32935 1726853729.75416: in VariableManager get_vars() 32935 1726853729.75456: Calling all_inventory to load vars for managed_node1 32935 1726853729.75459: Calling groups_inventory to load vars for managed_node1 32935 1726853729.75461: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853729.75472: Calling all_plugins_play to load vars for managed_node1 32935 1726853729.75475: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853729.75478: Calling groups_plugins_play to load vars for managed_node1 32935 1726853729.76279: done sending task result for task 02083763-bbaf-84df-441d-000000000022 32935 1726853729.76284: WORKER PROCESS EXITING 32935 1726853729.78326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853729.80628: done with get_vars() 32935 1726853729.80648: done getting variables 32935 1726853729.80697: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:29 -0400 (0:00:00.867) 0:00:14.942 ****** 32935 1726853729.80723: entering _queue_task() for managed_node1/service 32935 1726853729.80984: worker is 1 (out of 1 available) 32935 1726853729.80998: exiting _queue_task() for managed_node1/service 32935 1726853729.81012: done queuing things up, now waiting for results queue to drain 32935 1726853729.81014: waiting for pending results... 
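For reference, the systemd result captured above comes straight from the module arguments echoed in the log (ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true, run under no_log, which is why the playbook output is rendered as "censored"). A minimal sketch of a task that would produce an invocation like this, assuming the generic ansible.builtin.systemd module rather than the role's actual tasks/main.yml entry:

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager   # unit named in the logged module_args
        state: started         # matches "state": "started" in the invocation
        enabled: true          # matches "enabled": true in the invocation
      no_log: true             # why the result above is shown as "censored"

The changed=false in the result is expected here: the logged unit properties show ActiveState=active and UnitFileState=enabled, so the module reports success without making any change.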
32935 1726853729.81252: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32935 1726853729.81577: in run() - task 02083763-bbaf-84df-441d-000000000023 32935 1726853729.81581: variable 'ansible_search_path' from source: unknown 32935 1726853729.81584: variable 'ansible_search_path' from source: unknown 32935 1726853729.81587: calling self._execute() 32935 1726853729.81589: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853729.81592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853729.81595: variable 'omit' from source: magic vars 32935 1726853729.81914: variable 'ansible_distribution_major_version' from source: facts 32935 1726853729.81931: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853729.82046: variable 'network_provider' from source: set_fact 32935 1726853729.82050: Evaluated conditional (network_provider == "nm"): True 32935 1726853729.82160: variable '__network_wpa_supplicant_required' from source: role '' defaults 32935 1726853729.82314: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32935 1726853729.82521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853729.84977: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853729.85025: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853729.85053: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853729.85082: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853729.85105: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853729.85176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853729.85196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853729.85216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853729.85244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853729.85254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853729.85292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853729.85308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 32935 1726853729.85327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853729.85353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853729.85366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853729.85396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853729.85412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853729.85430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853729.85457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853729.85469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853729.85561: variable 'network_connections' from source: task vars 32935 1726853729.85579: variable 'interface' from source: play vars 32935 1726853729.85630: variable 'interface' from source: play vars 32935 1726853729.85645: variable 'vlan_interface' from source: play vars 32935 1726853729.85690: variable 'vlan_interface' from source: play vars 32935 1726853729.85696: variable 'interface' from source: play vars 32935 1726853729.85736: variable 'interface' from source: play vars 32935 1726853729.85793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853729.85904: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853729.85931: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853729.85952: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853729.85978: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853729.86009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853729.86024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853729.86041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853729.86058: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853729.86101: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853729.86280: variable 'network_connections' from source: task vars 32935 1726853729.86283: variable 'interface' from source: play vars 32935 1726853729.86414: variable 'interface' from source: play vars 32935 1726853729.86417: variable 'vlan_interface' from source: play vars 32935 1726853729.86576: variable 'vlan_interface' from source: play vars 32935 1726853729.86579: variable 'interface' from source: play vars 32935 1726853729.86590: variable 'interface' from source: play vars 32935 1726853729.86593: Evaluated conditional (__network_wpa_supplicant_required): False 32935 1726853729.86595: when evaluation is False, skipping this task 32935 1726853729.86597: _execute() done 32935 1726853729.86599: dumping result to json 32935 1726853729.86601: done dumping result, returning 32935 1726853729.86745: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-84df-441d-000000000023] 32935 1726853729.86749: sending task result for task 02083763-bbaf-84df-441d-000000000023 32935 1726853729.87042: done sending task result for task 02083763-bbaf-84df-441d-000000000023 32935 1726853729.87045: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 32935 1726853729.87095: no more pending results, returning what we have 32935 1726853729.87098: results queue empty 32935 1726853729.87099: checking for any_errors_fatal 32935 1726853729.87116: done checking for any_errors_fatal 32935 1726853729.87116: checking for max_fail_percentage 32935 1726853729.87118: done checking for max_fail_percentage 32935 1726853729.87119: checking to see if all hosts have failed and the running result is not ok 32935 1726853729.87174: done checking to see if all hosts have failed 32935 1726853729.87175: getting the remaining hosts for this loop 32935 1726853729.87177: done getting the remaining hosts for this loop 32935 1726853729.87181: getting the next task for host managed_node1 32935 1726853729.87187: done getting next task for host managed_node1 32935 1726853729.87191: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 32935 1726853729.87193: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853729.87207: getting variables 32935 1726853729.87209: in VariableManager get_vars() 32935 1726853729.87298: Calling all_inventory to load vars for managed_node1 32935 1726853729.87301: Calling groups_inventory to load vars for managed_node1 32935 1726853729.87303: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853729.87313: Calling all_plugins_play to load vars for managed_node1 32935 1726853729.87315: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853729.87318: Calling groups_plugins_play to load vars for managed_node1 32935 1726853729.88903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853729.90545: done with get_vars() 32935 1726853729.90580: done getting variables 32935 1726853729.90644: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:29 -0400 (0:00:00.099) 0:00:15.042 ****** 32935 1726853729.90678: entering _queue_task() for managed_node1/service 32935 1726853729.91040: worker is 1 (out of 1 available) 32935 1726853729.91054: exiting _queue_task() for managed_node1/service 32935 1726853729.91068: done queuing things up, now waiting for results queue to drain 32935 1726853729.91070: waiting for pending results... 32935 1726853729.91591: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 32935 1726853729.91597: in run() - task 02083763-bbaf-84df-441d-000000000024 32935 1726853729.91600: variable 'ansible_search_path' from source: unknown 32935 1726853729.91603: variable 'ansible_search_path' from source: unknown 32935 1726853729.91777: calling self._execute() 32935 1726853729.91782: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853729.91786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853729.91789: variable 'omit' from source: magic vars 32935 1726853729.92035: variable 'ansible_distribution_major_version' from source: facts 32935 1726853729.92056: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853729.92184: variable 'network_provider' from source: set_fact 32935 1726853729.92190: Evaluated conditional (network_provider == "initscripts"): False 32935 1726853729.92193: when evaluation is False, skipping this task 32935 1726853729.92196: _execute() done 32935 1726853729.92198: dumping result to json 32935 1726853729.92201: done dumping result, returning 32935 1726853729.92209: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-84df-441d-000000000024] 32935 1726853729.92214: sending task result for task 02083763-bbaf-84df-441d-000000000024 32935 1726853729.92305: done sending task result for task 02083763-bbaf-84df-441d-000000000024 32935 1726853729.92308: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 
1726853729.92354: no more pending results, returning what we have 32935 1726853729.92363: results queue empty 32935 1726853729.92365: checking for any_errors_fatal 32935 1726853729.92377: done checking for any_errors_fatal 32935 1726853729.92378: checking for max_fail_percentage 32935 1726853729.92380: done checking for max_fail_percentage 32935 1726853729.92381: checking to see if all hosts have failed and the running result is not ok 32935 1726853729.92382: done checking to see if all hosts have failed 32935 1726853729.92383: getting the remaining hosts for this loop 32935 1726853729.92384: done getting the remaining hosts for this loop 32935 1726853729.92388: getting the next task for host managed_node1 32935 1726853729.92396: done getting next task for host managed_node1 32935 1726853729.92399: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32935 1726853729.92402: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853729.92418: getting variables 32935 1726853729.92420: in VariableManager get_vars() 32935 1726853729.92462: Calling all_inventory to load vars for managed_node1 32935 1726853729.92465: Calling groups_inventory to load vars for managed_node1 32935 1726853729.92467: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853729.92597: Calling all_plugins_play to load vars for managed_node1 32935 1726853729.92601: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853729.92604: Calling groups_plugins_play to load vars for managed_node1 32935 1726853729.94347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853729.96032: done with get_vars() 32935 1726853729.96070: done getting variables 32935 1726853729.96132: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:29 -0400 (0:00:00.054) 0:00:15.097 ****** 32935 1726853729.96165: entering _queue_task() for managed_node1/copy 32935 1726853729.96697: worker is 1 (out of 1 available) 32935 1726853729.96708: exiting _queue_task() for managed_node1/copy 32935 1726853729.96720: done queuing things up, now waiting for results queue to drain 32935 1726853729.96722: waiting for pending results... 
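The skip recorded above follows directly from the conditional in the log: network_provider was evaluated as "nm" earlier in this run, so the test network_provider == "initscripts" is False and the "Enable network service" task never executes. A minimal sketch of that guard pattern, with the service name chosen purely for illustration (the role's real task body may differ):

    - name: Enable network service
      ansible.builtin.service:
        name: network      # hypothetical unit name for the initscripts provider
        enabled: true
      when: network_provider == "initscripts"   # False on this host, so the task is skipped

The task queued next, "Ensure initscripts network file dependency is present", is guarded by the same condition and is likewise skipped on this NetworkManager-managed host, as the following log entries show.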
32935 1726853729.96873: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32935 1726853729.97004: in run() - task 02083763-bbaf-84df-441d-000000000025 32935 1726853729.97018: variable 'ansible_search_path' from source: unknown 32935 1726853729.97022: variable 'ansible_search_path' from source: unknown 32935 1726853729.97075: calling self._execute() 32935 1726853729.97182: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853729.97186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853729.97198: variable 'omit' from source: magic vars 32935 1726853729.97595: variable 'ansible_distribution_major_version' from source: facts 32935 1726853729.97617: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853729.97742: variable 'network_provider' from source: set_fact 32935 1726853729.97777: Evaluated conditional (network_provider == "initscripts"): False 32935 1726853729.97780: when evaluation is False, skipping this task 32935 1726853729.97783: _execute() done 32935 1726853729.97785: dumping result to json 32935 1726853729.97787: done dumping result, returning 32935 1726853729.97790: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-84df-441d-000000000025] 32935 1726853729.97793: sending task result for task 02083763-bbaf-84df-441d-000000000025 32935 1726853729.97868: done sending task result for task 02083763-bbaf-84df-441d-000000000025 skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 32935 1726853729.97920: no more pending results, returning what we have 32935 1726853729.97929: results queue empty 32935 1726853729.97930: checking for any_errors_fatal 32935 1726853729.97939: done checking for any_errors_fatal 32935 1726853729.97940: checking for max_fail_percentage 32935 1726853729.97942: done checking for max_fail_percentage 32935 1726853729.97943: checking to see if all hosts have failed and the running result is not ok 32935 1726853729.97944: done checking to see if all hosts have failed 32935 1726853729.97945: getting the remaining hosts for this loop 32935 1726853729.97947: done getting the remaining hosts for this loop 32935 1726853729.97951: getting the next task for host managed_node1 32935 1726853729.97963: done getting next task for host managed_node1 32935 1726853729.97968: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32935 1726853729.97974: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853729.97994: WORKER PROCESS EXITING 32935 1726853729.98043: getting variables 32935 1726853729.98045: in VariableManager get_vars() 32935 1726853729.98095: Calling all_inventory to load vars for managed_node1 32935 1726853729.98148: Calling groups_inventory to load vars for managed_node1 32935 1726853729.98152: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853729.98167: Calling all_plugins_play to load vars for managed_node1 32935 1726853729.98170: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853729.98175: Calling groups_plugins_play to load vars for managed_node1 32935 1726853729.99794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853730.01519: done with get_vars() 32935 1726853730.01544: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:30 -0400 (0:00:00.054) 0:00:15.151 ****** 32935 1726853730.01636: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 32935 1726853730.01638: Creating lock for fedora.linux_system_roles.network_connections 32935 1726853730.02176: worker is 1 (out of 1 available) 32935 1726853730.02186: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 32935 1726853730.02196: done queuing things up, now waiting for results queue to drain 32935 1726853730.02197: waiting for pending results... 32935 1726853730.02333: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32935 1726853730.02479: in run() - task 02083763-bbaf-84df-441d-000000000026 32935 1726853730.02620: variable 'ansible_search_path' from source: unknown 32935 1726853730.02623: variable 'ansible_search_path' from source: unknown 32935 1726853730.02626: calling self._execute() 32935 1726853730.02777: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.02781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.02784: variable 'omit' from source: magic vars 32935 1726853730.03101: variable 'ansible_distribution_major_version' from source: facts 32935 1726853730.03113: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853730.03120: variable 'omit' from source: magic vars 32935 1726853730.03179: variable 'omit' from source: magic vars 32935 1726853730.03370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853730.05264: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853730.05339: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853730.05359: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853730.05789: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853730.05793: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853730.05795: variable 'network_provider' from source: set_fact 32935 1726853730.05798: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853730.05800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853730.05803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853730.05804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853730.05806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853730.05863: variable 'omit' from source: magic vars 32935 1726853730.05987: variable 'omit' from source: magic vars 32935 1726853730.06092: variable 'network_connections' from source: task vars 32935 1726853730.06105: variable 'interface' from source: play vars 32935 1726853730.06179: variable 'interface' from source: play vars 32935 1726853730.06195: variable 'vlan_interface' from source: play vars 32935 1726853730.06268: variable 'vlan_interface' from source: play vars 32935 1726853730.06276: variable 'interface' from source: play vars 32935 1726853730.06331: variable 'interface' from source: play vars 32935 1726853730.06548: variable 'omit' from source: magic vars 32935 1726853730.06557: variable '__lsr_ansible_managed' from source: task vars 32935 1726853730.06627: variable '__lsr_ansible_managed' from source: task vars 32935 1726853730.06942: Loaded config def from plugin (lookup/template) 32935 1726853730.06947: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 32935 1726853730.06949: File lookup term: get_ansible_managed.j2 32935 1726853730.06951: variable 'ansible_search_path' from source: unknown 32935 1726853730.06978: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 32935 1726853730.06982: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 32935 
1726853730.06989: variable 'ansible_search_path' from source: unknown 32935 1726853730.12575: variable 'ansible_managed' from source: unknown 32935 1726853730.12704: variable 'omit' from source: magic vars 32935 1726853730.12736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853730.12768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853730.12794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853730.12815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853730.12829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853730.12860: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853730.12868: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.12877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.13075: Set connection var ansible_timeout to 10 32935 1726853730.13078: Set connection var ansible_shell_type to sh 32935 1726853730.13080: Set connection var ansible_pipelining to False 32935 1726853730.13082: Set connection var ansible_connection to ssh 32935 1726853730.13084: Set connection var ansible_shell_executable to /bin/sh 32935 1726853730.13086: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853730.13088: variable 'ansible_shell_executable' from source: unknown 32935 1726853730.13089: variable 'ansible_connection' from source: unknown 32935 1726853730.13092: variable 'ansible_module_compression' from source: unknown 32935 1726853730.13093: variable 'ansible_shell_type' from source: unknown 32935 1726853730.13095: variable 'ansible_shell_executable' from source: unknown 32935 1726853730.13097: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.13098: variable 'ansible_pipelining' from source: unknown 32935 1726853730.13100: variable 'ansible_timeout' from source: unknown 32935 1726853730.13102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.13204: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853730.13219: variable 'omit' from source: magic vars 32935 1726853730.13232: starting attempt loop 32935 1726853730.13239: running the handler 32935 1726853730.13255: _low_level_execute_command(): starting 32935 1726853730.13266: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853730.13942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853730.13958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853730.13975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853730.13996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853730.14014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853730.14027: 
stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853730.14040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853730.14060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853730.14075: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853730.14087: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32935 1726853730.14169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853730.14196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853730.14274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853730.15956: stdout chunk (state=3): >>>/root <<< 32935 1726853730.16140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853730.16191: stderr chunk (state=3): >>><<< 32935 1726853730.16194: stdout chunk (state=3): >>><<< 32935 1726853730.16218: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853730.16233: _low_level_execute_command(): starting 32935 1726853730.16240: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004 `" && echo ansible-tmp-1726853730.1621857-33690-42874536784004="` echo /root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004 `" ) && sleep 0' 32935 1726853730.16906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853730.16925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853730.17070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853730.17076: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853730.17079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853730.17081: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853730.17083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853730.17085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853730.17087: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853730.17089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32935 1726853730.17091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853730.17093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853730.17115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853730.17147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853730.17201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853730.19104: stdout chunk (state=3): >>>ansible-tmp-1726853730.1621857-33690-42874536784004=/root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004 <<< 32935 1726853730.19202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853730.19376: stderr chunk (state=3): >>><<< 32935 1726853730.19379: stdout chunk (state=3): >>><<< 32935 1726853730.19381: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853730.1621857-33690-42874536784004=/root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853730.19387: variable 'ansible_module_compression' from source: unknown 32935 1726853730.19389: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 32935 
1726853730.19391: ANSIBALLZ: Acquiring lock 32935 1726853730.19393: ANSIBALLZ: Lock acquired: 140683289737984 32935 1726853730.19395: ANSIBALLZ: Creating module 32935 1726853730.39016: ANSIBALLZ: Writing module into payload 32935 1726853730.39355: ANSIBALLZ: Writing module 32935 1726853730.39386: ANSIBALLZ: Renaming module 32935 1726853730.39397: ANSIBALLZ: Done creating module 32935 1726853730.39432: variable 'ansible_facts' from source: unknown 32935 1726853730.39558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/AnsiballZ_network_connections.py 32935 1726853730.39803: Sending initial data 32935 1726853730.39806: Sent initial data (167 bytes) 32935 1726853730.40341: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853730.40386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853730.40449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853730.40480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853730.40547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853730.42230: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853730.42267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853730.42326: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp7a9fo6nj /root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/AnsiballZ_network_connections.py <<< 32935 1726853730.42330: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/AnsiballZ_network_connections.py" <<< 32935 1726853730.42361: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp7a9fo6nj" to remote "/root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/AnsiballZ_network_connections.py" <<< 32935 1726853730.43347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853730.43375: stderr chunk (state=3): >>><<< 32935 1726853730.43378: stdout chunk (state=3): >>><<< 32935 1726853730.43426: done transferring module to remote 32935 1726853730.43442: _low_level_execute_command(): starting 32935 1726853730.43462: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/ /root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/AnsiballZ_network_connections.py && sleep 0' 32935 1726853730.44169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853730.44184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853730.44207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853730.44221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853730.44298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853730.46088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853730.46113: stderr chunk (state=3): >>><<< 32935 1726853730.46126: stdout chunk (state=3): >>><<< 32935 1726853730.46146: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853730.46231: _low_level_execute_command(): starting 32935 1726853730.46235: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/AnsiballZ_network_connections.py && sleep 0' 32935 1726853730.46767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853730.46784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853730.46801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853730.46827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853730.46842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853730.46885: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853730.46952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853730.46969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853730.47010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853730.47063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853730.81774: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": 
false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 32935 1726853730.85348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853730.85352: stdout chunk (state=3): >>><<< 32935 1726853730.85354: stderr chunk (state=3): >>><<< 32935 1726853730.85362: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "type": "ethernet", "state": "up", "mtu": 1492, "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}, {"name": "lsr101.90", "parent": "lsr101", "type": "vlan", "vlan_id": 90, "mtu": 1280, "state": "up", "autoconnect": false, "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853730.85365: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'type': 'ethernet', 'state': 'up', 'mtu': 1492, 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}, {'name': 'lsr101.90', 'parent': 'lsr101', 'type': 'vlan', 'vlan_id': 90, 'mtu': 1280, 'state': 'up', 'autoconnect': False, 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853730.85367: _low_level_execute_command(): starting 32935 1726853730.85369: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853730.1621857-33690-42874536784004/ > /dev/null 2>&1 && sleep 0' 32935 1726853730.85748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853730.85752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853730.85785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853730.85792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853730.85795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853730.85842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853730.85846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853730.85862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853730.85900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853730.87750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 
1726853730.87783: stderr chunk (state=3): >>><<< 32935 1726853730.87786: stdout chunk (state=3): >>><<< 32935 1726853730.87801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853730.87806: handler run complete 32935 1726853730.87837: attempt loop complete, returning result 32935 1726853730.87840: _execute() done 32935 1726853730.87842: dumping result to json 32935 1726853730.87848: done dumping result, returning 32935 1726853730.87856: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-84df-441d-000000000026] 32935 1726853730.87859: sending task result for task 02083763-bbaf-84df-441d-000000000026 32935 1726853730.87966: done sending task result for task 02083763-bbaf-84df-441d-000000000026 32935 1726853730.87969: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f [006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a [007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f (not-active) [008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a (not-active) 32935 1726853730.88107: no more pending results, returning what we have 32935 1726853730.88110: results queue empty 32935 1726853730.88111: checking for any_errors_fatal 32935 1726853730.88120: done checking for any_errors_fatal 32935 1726853730.88120: checking for max_fail_percentage 32935 1726853730.88122: done checking for max_fail_percentage 32935 1726853730.88123: checking to see if all hosts have failed and the running result is not ok 32935 
1726853730.88124: done checking to see if all hosts have failed 32935 1726853730.88124: getting the remaining hosts for this loop 32935 1726853730.88126: done getting the remaining hosts for this loop 32935 1726853730.88129: getting the next task for host managed_node1 32935 1726853730.88135: done getting next task for host managed_node1 32935 1726853730.88138: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 32935 1726853730.88141: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853730.88150: getting variables 32935 1726853730.88152: in VariableManager get_vars() 32935 1726853730.88197: Calling all_inventory to load vars for managed_node1 32935 1726853730.88199: Calling groups_inventory to load vars for managed_node1 32935 1726853730.88202: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853730.88211: Calling all_plugins_play to load vars for managed_node1 32935 1726853730.88213: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853730.88215: Calling groups_plugins_play to load vars for managed_node1 32935 1726853730.89163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853730.90017: done with get_vars() 32935 1726853730.90035: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:30 -0400 (0:00:00.884) 0:00:16.036 ****** 32935 1726853730.90101: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 32935 1726853730.90102: Creating lock for fedora.linux_system_roles.network_state 32935 1726853730.90351: worker is 1 (out of 1 available) 32935 1726853730.90366: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 32935 1726853730.90382: done queuing things up, now waiting for results queue to drain 32935 1726853730.90384: waiting for pending results... 
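For context, the "Configure networking connection profiles" result shown above corresponds to role input of roughly the following shape. This is a hedged reconstruction from the logged module_args, not the test playbook itself:

    network_connections:
      # lsr101: plain ethernet profile, no DHCPv4 / no IPv6 autoconf, MTU 1492
      - name: lsr101
        type: ethernet
        state: up
        mtu: 1492
        autoconnect: false
        ip:
          dhcp4: false
          auto6: false
      # lsr101.90: VLAN 90 stacked on lsr101, MTU 1280
      - name: lsr101.90
        parent: lsr101
        type: vlan
        vlan_id: 90
        state: up
        mtu: 1280
        autoconnect: false
        ip:
          dhcp4: false
          auto6: false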
32935 1726853730.90555: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 32935 1726853730.90648: in run() - task 02083763-bbaf-84df-441d-000000000027 32935 1726853730.90659: variable 'ansible_search_path' from source: unknown 32935 1726853730.90662: variable 'ansible_search_path' from source: unknown 32935 1726853730.90698: calling self._execute() 32935 1726853730.90773: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.90778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.90789: variable 'omit' from source: magic vars 32935 1726853730.91066: variable 'ansible_distribution_major_version' from source: facts 32935 1726853730.91077: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853730.91156: variable 'network_state' from source: role '' defaults 32935 1726853730.91165: Evaluated conditional (network_state != {}): False 32935 1726853730.91168: when evaluation is False, skipping this task 32935 1726853730.91173: _execute() done 32935 1726853730.91175: dumping result to json 32935 1726853730.91179: done dumping result, returning 32935 1726853730.91186: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-84df-441d-000000000027] 32935 1726853730.91191: sending task result for task 02083763-bbaf-84df-441d-000000000027 32935 1726853730.91269: done sending task result for task 02083763-bbaf-84df-441d-000000000027 32935 1726853730.91274: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853730.91321: no more pending results, returning what we have 32935 1726853730.91325: results queue empty 32935 1726853730.91326: checking for any_errors_fatal 32935 1726853730.91338: done checking for any_errors_fatal 32935 1726853730.91338: checking for max_fail_percentage 32935 1726853730.91340: done checking for max_fail_percentage 32935 1726853730.91340: checking to see if all hosts have failed and the running result is not ok 32935 1726853730.91342: done checking to see if all hosts have failed 32935 1726853730.91342: getting the remaining hosts for this loop 32935 1726853730.91344: done getting the remaining hosts for this loop 32935 1726853730.91347: getting the next task for host managed_node1 32935 1726853730.91356: done getting next task for host managed_node1 32935 1726853730.91359: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32935 1726853730.91362: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853730.91383: getting variables 32935 1726853730.91385: in VariableManager get_vars() 32935 1726853730.91422: Calling all_inventory to load vars for managed_node1 32935 1726853730.91424: Calling groups_inventory to load vars for managed_node1 32935 1726853730.91426: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853730.91435: Calling all_plugins_play to load vars for managed_node1 32935 1726853730.91438: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853730.91440: Calling groups_plugins_play to load vars for managed_node1 32935 1726853730.92204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853730.93165: done with get_vars() 32935 1726853730.93181: done getting variables 32935 1726853730.93224: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:30 -0400 (0:00:00.031) 0:00:16.068 ****** 32935 1726853730.93245: entering _queue_task() for managed_node1/debug 32935 1726853730.93466: worker is 1 (out of 1 available) 32935 1726853730.93481: exiting _queue_task() for managed_node1/debug 32935 1726853730.93494: done queuing things up, now waiting for results queue to drain 32935 1726853730.93496: waiting for pending results... 32935 1726853730.93663: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32935 1726853730.93752: in run() - task 02083763-bbaf-84df-441d-000000000028 32935 1726853730.93767: variable 'ansible_search_path' from source: unknown 32935 1726853730.93772: variable 'ansible_search_path' from source: unknown 32935 1726853730.93801: calling self._execute() 32935 1726853730.93873: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.93878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.93888: variable 'omit' from source: magic vars 32935 1726853730.94152: variable 'ansible_distribution_major_version' from source: facts 32935 1726853730.94165: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853730.94172: variable 'omit' from source: magic vars 32935 1726853730.94208: variable 'omit' from source: magic vars 32935 1726853730.94232: variable 'omit' from source: magic vars 32935 1726853730.94273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853730.94300: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853730.94320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853730.94362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853730.94365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853730.94380: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853730.94385: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.94387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.94452: Set connection var ansible_timeout to 10 32935 1726853730.94456: Set connection var ansible_shell_type to sh 32935 1726853730.94466: Set connection var ansible_pipelining to False 32935 1726853730.94468: Set connection var ansible_connection to ssh 32935 1726853730.94474: Set connection var ansible_shell_executable to /bin/sh 32935 1726853730.94481: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853730.94501: variable 'ansible_shell_executable' from source: unknown 32935 1726853730.94504: variable 'ansible_connection' from source: unknown 32935 1726853730.94508: variable 'ansible_module_compression' from source: unknown 32935 1726853730.94510: variable 'ansible_shell_type' from source: unknown 32935 1726853730.94513: variable 'ansible_shell_executable' from source: unknown 32935 1726853730.94515: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.94517: variable 'ansible_pipelining' from source: unknown 32935 1726853730.94519: variable 'ansible_timeout' from source: unknown 32935 1726853730.94521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.94625: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853730.94633: variable 'omit' from source: magic vars 32935 1726853730.94639: starting attempt loop 32935 1726853730.94642: running the handler 32935 1726853730.94738: variable '__network_connections_result' from source: set_fact 32935 1726853730.94786: handler run complete 32935 1726853730.94799: attempt loop complete, returning result 32935 1726853730.94802: _execute() done 32935 1726853730.94805: dumping result to json 32935 1726853730.94809: done dumping result, returning 32935 1726853730.94818: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-84df-441d-000000000028] 32935 1726853730.94822: sending task result for task 02083763-bbaf-84df-441d-000000000028 32935 1726853730.94902: done sending task result for task 02083763-bbaf-84df-441d-000000000028 32935 1726853730.94905: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a (not-active)" ] } 32935 1726853730.94991: no more pending results, returning what we have 32935 1726853730.94994: results queue empty 32935 1726853730.94995: checking for any_errors_fatal 32935 1726853730.95000: done checking for any_errors_fatal 32935 1726853730.95001: checking for max_fail_percentage 32935 
1726853730.95002: done checking for max_fail_percentage 32935 1726853730.95002: checking to see if all hosts have failed and the running result is not ok 32935 1726853730.95003: done checking to see if all hosts have failed 32935 1726853730.95004: getting the remaining hosts for this loop 32935 1726853730.95005: done getting the remaining hosts for this loop 32935 1726853730.95009: getting the next task for host managed_node1 32935 1726853730.95015: done getting next task for host managed_node1 32935 1726853730.95018: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32935 1726853730.95021: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853730.95032: getting variables 32935 1726853730.95034: in VariableManager get_vars() 32935 1726853730.95067: Calling all_inventory to load vars for managed_node1 32935 1726853730.95070: Calling groups_inventory to load vars for managed_node1 32935 1726853730.95073: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853730.95081: Calling all_plugins_play to load vars for managed_node1 32935 1726853730.95084: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853730.95086: Calling groups_plugins_play to load vars for managed_node1 32935 1726853730.95848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853730.96713: done with get_vars() 32935 1726853730.96729: done getting variables 32935 1726853730.96778: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:30 -0400 (0:00:00.035) 0:00:16.103 ****** 32935 1726853730.96806: entering _queue_task() for managed_node1/debug 32935 1726853730.97033: worker is 1 (out of 1 available) 32935 1726853730.97048: exiting _queue_task() for managed_node1/debug 32935 1726853730.97060: done queuing things up, now waiting for results queue to drain 32935 1726853730.97062: waiting for pending results... 
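The stderr lines printed just above come from a debug task that exposes __network_connections_result.stderr_lines. A minimal sketch of such a task, assumed to mirror the one logged at roles/network/tasks/main.yml:177 rather than copied from it:

    # Sketch only; assumed shape of the task logged at main.yml:177
    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines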
32935 1726853730.97239: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32935 1726853730.97331: in run() - task 02083763-bbaf-84df-441d-000000000029 32935 1726853730.97342: variable 'ansible_search_path' from source: unknown 32935 1726853730.97346: variable 'ansible_search_path' from source: unknown 32935 1726853730.97379: calling self._execute() 32935 1726853730.97451: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.97458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.97469: variable 'omit' from source: magic vars 32935 1726853730.97748: variable 'ansible_distribution_major_version' from source: facts 32935 1726853730.97755: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853730.97764: variable 'omit' from source: magic vars 32935 1726853730.97801: variable 'omit' from source: magic vars 32935 1726853730.97828: variable 'omit' from source: magic vars 32935 1726853730.97859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853730.97891: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853730.97907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853730.97920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853730.97929: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853730.97954: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853730.97959: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.97967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.98032: Set connection var ansible_timeout to 10 32935 1726853730.98035: Set connection var ansible_shell_type to sh 32935 1726853730.98044: Set connection var ansible_pipelining to False 32935 1726853730.98046: Set connection var ansible_connection to ssh 32935 1726853730.98049: Set connection var ansible_shell_executable to /bin/sh 32935 1726853730.98055: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853730.98079: variable 'ansible_shell_executable' from source: unknown 32935 1726853730.98082: variable 'ansible_connection' from source: unknown 32935 1726853730.98084: variable 'ansible_module_compression' from source: unknown 32935 1726853730.98087: variable 'ansible_shell_type' from source: unknown 32935 1726853730.98089: variable 'ansible_shell_executable' from source: unknown 32935 1726853730.98091: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853730.98093: variable 'ansible_pipelining' from source: unknown 32935 1726853730.98096: variable 'ansible_timeout' from source: unknown 32935 1726853730.98101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853730.98206: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 
1726853730.98215: variable 'omit' from source: magic vars 32935 1726853730.98220: starting attempt loop 32935 1726853730.98223: running the handler 32935 1726853730.98263: variable '__network_connections_result' from source: set_fact 32935 1726853730.98320: variable '__network_connections_result' from source: set_fact 32935 1726853730.98430: handler run complete 32935 1726853730.98452: attempt loop complete, returning result 32935 1726853730.98454: _execute() done 32935 1726853730.98457: dumping result to json 32935 1726853730.98465: done dumping result, returning 32935 1726853730.98474: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-84df-441d-000000000029] 32935 1726853730.98479: sending task result for task 02083763-bbaf-84df-441d-000000000029 32935 1726853730.98565: done sending task result for task 02083763-bbaf-84df-441d-000000000029 32935 1726853730.98568: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1492, "name": "lsr101", "state": "up", "type": "ethernet" }, { "autoconnect": false, "ip": { "auto6": false, "dhcp4": false }, "mtu": 1280, "name": "lsr101.90", "parent": "lsr101", "state": "up", "type": "vlan", "vlan_id": 90 } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f\n[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a\n[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f (not-active)\n[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'lsr101': add connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f", "[006] #1, state:up persistent_state:present, 'lsr101.90': add connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a", "[007] #0, state:up persistent_state:present, 'lsr101': up connection lsr101, e9b344ac-7aa9-4d34-9c01-f1b4dd46183f (not-active)", "[008] #1, state:up persistent_state:present, 'lsr101.90': up connection lsr101.90, 84991582-42ea-41a9-ba62-c7b3edc4be1a (not-active)" ] } } 32935 1726853730.98683: no more pending results, returning what we have 32935 1726853730.98686: results queue empty 32935 1726853730.98687: checking for any_errors_fatal 32935 1726853730.98691: done checking for any_errors_fatal 32935 1726853730.98692: checking for max_fail_percentage 32935 1726853730.98693: done checking for max_fail_percentage 32935 1726853730.98693: checking to see if all hosts have failed and the running result is not ok 32935 1726853730.98695: done checking to see if all hosts have failed 32935 1726853730.98695: getting the remaining hosts for this loop 32935 1726853730.98697: done getting the remaining hosts for this loop 32935 1726853730.98701: getting the next task for host managed_node1 32935 1726853730.98706: done getting next task for host managed_node1 32935 1726853730.98715: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the 
network_state 32935 1726853730.98717: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853730.98727: getting variables 32935 1726853730.98728: in VariableManager get_vars() 32935 1726853730.98759: Calling all_inventory to load vars for managed_node1 32935 1726853730.98762: Calling groups_inventory to load vars for managed_node1 32935 1726853730.98763: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853730.98792: Calling all_plugins_play to load vars for managed_node1 32935 1726853730.98795: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853730.98798: Calling groups_plugins_play to load vars for managed_node1 32935 1726853730.99697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853731.00576: done with get_vars() 32935 1726853731.00593: done getting variables 32935 1726853731.00649: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:31 -0400 (0:00:00.038) 0:00:16.142 ****** 32935 1726853731.00675: entering _queue_task() for managed_node1/debug 32935 1726853731.00923: worker is 1 (out of 1 available) 32935 1726853731.00938: exiting _queue_task() for managed_node1/debug 32935 1726853731.00949: done queuing things up, now waiting for results queue to drain 32935 1726853731.00951: waiting for pending results... 
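Every network_state task in this run is skipped because network_state keeps its role default of {}; the guard is the conditional logged as "Evaluated conditional (network_state != {}): False". A hedged sketch of that pattern, reusing the module name from the earlier _queue_task entry; the desired_state parameter name is an assumption, not something this log confirms:

    # Sketch of the skip guard; runs only when the caller sets a non-empty network_state
    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        desired_state: "{{ network_state }}"   # parameter name assumed for illustration
      when: network_state != {}                # logged as evaluating to False in this run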
32935 1726853731.01204: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32935 1726853731.01273: in run() - task 02083763-bbaf-84df-441d-00000000002a 32935 1726853731.01277: variable 'ansible_search_path' from source: unknown 32935 1726853731.01280: variable 'ansible_search_path' from source: unknown 32935 1726853731.01389: calling self._execute() 32935 1726853731.01405: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.01413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.01421: variable 'omit' from source: magic vars 32935 1726853731.01809: variable 'ansible_distribution_major_version' from source: facts 32935 1726853731.01852: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853731.01964: variable 'network_state' from source: role '' defaults 32935 1726853731.01974: Evaluated conditional (network_state != {}): False 32935 1726853731.01977: when evaluation is False, skipping this task 32935 1726853731.01980: _execute() done 32935 1726853731.01982: dumping result to json 32935 1726853731.01985: done dumping result, returning 32935 1726853731.01987: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-84df-441d-00000000002a] 32935 1726853731.01990: sending task result for task 02083763-bbaf-84df-441d-00000000002a skipping: [managed_node1] => { "false_condition": "network_state != {}" } 32935 1726853731.02219: no more pending results, returning what we have 32935 1726853731.02222: results queue empty 32935 1726853731.02223: checking for any_errors_fatal 32935 1726853731.02230: done checking for any_errors_fatal 32935 1726853731.02231: checking for max_fail_percentage 32935 1726853731.02233: done checking for max_fail_percentage 32935 1726853731.02233: checking to see if all hosts have failed and the running result is not ok 32935 1726853731.02235: done checking to see if all hosts have failed 32935 1726853731.02236: getting the remaining hosts for this loop 32935 1726853731.02237: done getting the remaining hosts for this loop 32935 1726853731.02241: getting the next task for host managed_node1 32935 1726853731.02247: done getting next task for host managed_node1 32935 1726853731.02250: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 32935 1726853731.02253: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853731.02265: getting variables 32935 1726853731.02267: in VariableManager get_vars() 32935 1726853731.02331: Calling all_inventory to load vars for managed_node1 32935 1726853731.02334: Calling groups_inventory to load vars for managed_node1 32935 1726853731.02336: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853731.02341: done sending task result for task 02083763-bbaf-84df-441d-00000000002a 32935 1726853731.02343: WORKER PROCESS EXITING 32935 1726853731.02351: Calling all_plugins_play to load vars for managed_node1 32935 1726853731.02353: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853731.02355: Calling groups_plugins_play to load vars for managed_node1 32935 1726853731.03242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853731.04102: done with get_vars() 32935 1726853731.04120: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:31 -0400 (0:00:00.035) 0:00:16.177 ****** 32935 1726853731.04191: entering _queue_task() for managed_node1/ping 32935 1726853731.04192: Creating lock for ping 32935 1726853731.04441: worker is 1 (out of 1 available) 32935 1726853731.04455: exiting _queue_task() for managed_node1/ping 32935 1726853731.04468: done queuing things up, now waiting for results queue to drain 32935 1726853731.04470: waiting for pending results... 32935 1726853731.04647: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 32935 1726853731.04741: in run() - task 02083763-bbaf-84df-441d-00000000002b 32935 1726853731.04753: variable 'ansible_search_path' from source: unknown 32935 1726853731.04756: variable 'ansible_search_path' from source: unknown 32935 1726853731.04788: calling self._execute() 32935 1726853731.04857: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.04865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.04874: variable 'omit' from source: magic vars 32935 1726853731.05152: variable 'ansible_distribution_major_version' from source: facts 32935 1726853731.05164: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853731.05170: variable 'omit' from source: magic vars 32935 1726853731.05208: variable 'omit' from source: magic vars 32935 1726853731.05233: variable 'omit' from source: magic vars 32935 1726853731.05270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853731.05299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853731.05316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853731.05329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853731.05339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853731.05367: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853731.05373: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.05375: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.05442: Set connection var ansible_timeout to 10 32935 1726853731.05448: Set connection var ansible_shell_type to sh 32935 1726853731.05459: Set connection var ansible_pipelining to False 32935 1726853731.05462: Set connection var ansible_connection to ssh 32935 1726853731.05464: Set connection var ansible_shell_executable to /bin/sh 32935 1726853731.05473: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853731.05490: variable 'ansible_shell_executable' from source: unknown 32935 1726853731.05493: variable 'ansible_connection' from source: unknown 32935 1726853731.05496: variable 'ansible_module_compression' from source: unknown 32935 1726853731.05498: variable 'ansible_shell_type' from source: unknown 32935 1726853731.05500: variable 'ansible_shell_executable' from source: unknown 32935 1726853731.05503: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.05507: variable 'ansible_pipelining' from source: unknown 32935 1726853731.05510: variable 'ansible_timeout' from source: unknown 32935 1726853731.05514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.05661: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853731.05676: variable 'omit' from source: magic vars 32935 1726853731.05679: starting attempt loop 32935 1726853731.05683: running the handler 32935 1726853731.05694: _low_level_execute_command(): starting 32935 1726853731.05701: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853731.06210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.06215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.06218: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.06220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.06268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.06282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853731.06286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.06335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.08025: stdout chunk (state=3): >>>/root <<< 32935 1726853731.08123: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 32935 1726853731.08155: stderr chunk (state=3): >>><<< 32935 1726853731.08159: stdout chunk (state=3): >>><<< 32935 1726853731.08186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853731.08197: _low_level_execute_command(): starting 32935 1726853731.08202: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021 `" && echo ansible-tmp-1726853731.0818536-33737-28449326334021="` echo /root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021 `" ) && sleep 0' 32935 1726853731.08638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853731.08642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.08651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.08654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.08704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.08714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853731.08720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.08750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.10646: stdout chunk (state=3): 
>>>ansible-tmp-1726853731.0818536-33737-28449326334021=/root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021 <<< 32935 1726853731.10744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853731.10775: stderr chunk (state=3): >>><<< 32935 1726853731.10778: stdout chunk (state=3): >>><<< 32935 1726853731.10794: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853731.0818536-33737-28449326334021=/root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853731.10831: variable 'ansible_module_compression' from source: unknown 32935 1726853731.10869: ANSIBALLZ: Using lock for ping 32935 1726853731.10875: ANSIBALLZ: Acquiring lock 32935 1726853731.10877: ANSIBALLZ: Lock acquired: 140683289725552 32935 1726853731.10880: ANSIBALLZ: Creating module 32935 1726853731.18092: ANSIBALLZ: Writing module into payload 32935 1726853731.18130: ANSIBALLZ: Writing module 32935 1726853731.18148: ANSIBALLZ: Renaming module 32935 1726853731.18154: ANSIBALLZ: Done creating module 32935 1726853731.18174: variable 'ansible_facts' from source: unknown 32935 1726853731.18217: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/AnsiballZ_ping.py 32935 1726853731.18321: Sending initial data 32935 1726853731.18324: Sent initial data (152 bytes) 32935 1726853731.18776: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.18779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853731.18782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853731.18784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.18828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853731.18841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.18893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.20509: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32935 1726853731.20516: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853731.20548: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853731.20587: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmptaqfr0kr /root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/AnsiballZ_ping.py <<< 32935 1726853731.20590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/AnsiballZ_ping.py" <<< 32935 1726853731.20623: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmptaqfr0kr" to remote "/root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/AnsiballZ_ping.py" <<< 32935 1726853731.20632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/AnsiballZ_ping.py" <<< 32935 1726853731.21122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853731.21165: stderr chunk (state=3): >>><<< 32935 1726853731.21168: stdout chunk (state=3): >>><<< 32935 1726853731.21214: done transferring module to remote 32935 1726853731.21223: _low_level_execute_command(): starting 32935 1726853731.21227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/ /root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/AnsiballZ_ping.py && sleep 0' 32935 1726853731.21650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853731.21657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853731.21677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853731.21695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass <<< 32935 1726853731.21698: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.21748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.21755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.21795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.23544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853731.23572: stderr chunk (state=3): >>><<< 32935 1726853731.23575: stdout chunk (state=3): >>><<< 32935 1726853731.23590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853731.23594: _low_level_execute_command(): starting 32935 1726853731.23596: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/AnsiballZ_ping.py && sleep 0' 32935 1726853731.24036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853731.24039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853731.24041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.24043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 
1726853731.24045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.24105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.24110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.24148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.39067: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 32935 1726853731.40277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853731.40305: stderr chunk (state=3): >>><<< 32935 1726853731.40308: stdout chunk (state=3): >>><<< 32935 1726853731.40325: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
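
The exchange above is one complete AnsiballZ round trip for the ping module: the controller creates a remote temp directory, packages the module as AnsiballZ_ping.py, pushes it over the multiplexed SSH connection with sftp, marks it executable, runs it with /usr/bin/python3.12, and reads back the JSON result {"ping": "pong"}. The role's own task file is not reproduced in this log; the sketch below is only a minimal task that would produce the same module invocation (the logged module_args contain just the default data: "pong"), with the task name taken from the trace further down.

- name: Re-test connectivity
  ansible.builtin.ping:
    data: pong        # default payload; matches the logged module_args
  # ping echoes the payload back, so the result is always changed: false
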
32935 1726853731.40346: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853731.40354: _low_level_execute_command(): starting 32935 1726853731.40359: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853731.0818536-33737-28449326334021/ > /dev/null 2>&1 && sleep 0' 32935 1726853731.40817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853731.40820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853731.40822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853731.40824: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.40826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.40876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.40892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.40929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.42734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853731.42761: stderr chunk (state=3): >>><<< 32935 1726853731.42766: stdout chunk (state=3): >>><<< 32935 1726853731.42779: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853731.42785: handler run complete 32935 1726853731.42797: attempt loop complete, returning result 32935 1726853731.42800: _execute() done 32935 1726853731.42804: dumping result to json 32935 1726853731.42806: done dumping result, returning 32935 1726853731.42815: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-84df-441d-00000000002b] 32935 1726853731.42817: sending task result for task 02083763-bbaf-84df-441d-00000000002b 32935 1726853731.42903: done sending task result for task 02083763-bbaf-84df-441d-00000000002b 32935 1726853731.42906: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 32935 1726853731.42965: no more pending results, returning what we have 32935 1726853731.42968: results queue empty 32935 1726853731.42969: checking for any_errors_fatal 32935 1726853731.42976: done checking for any_errors_fatal 32935 1726853731.42977: checking for max_fail_percentage 32935 1726853731.42978: done checking for max_fail_percentage 32935 1726853731.42979: checking to see if all hosts have failed and the running result is not ok 32935 1726853731.42980: done checking to see if all hosts have failed 32935 1726853731.42981: getting the remaining hosts for this loop 32935 1726853731.42982: done getting the remaining hosts for this loop 32935 1726853731.42986: getting the next task for host managed_node1 32935 1726853731.42996: done getting next task for host managed_node1 32935 1726853731.42998: ^ task is: TASK: meta (role_complete) 32935 1726853731.43001: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853731.43012: getting variables 32935 1726853731.43014: in VariableManager get_vars() 32935 1726853731.43056: Calling all_inventory to load vars for managed_node1 32935 1726853731.43061: Calling groups_inventory to load vars for managed_node1 32935 1726853731.43063: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853731.43080: Calling all_plugins_play to load vars for managed_node1 32935 1726853731.43083: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853731.43086: Calling groups_plugins_play to load vars for managed_node1 32935 1726853731.44024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853731.44878: done with get_vars() 32935 1726853731.44894: done getting variables 32935 1726853731.44952: done queuing things up, now waiting for results queue to drain 32935 1726853731.44954: results queue empty 32935 1726853731.44954: checking for any_errors_fatal 32935 1726853731.44956: done checking for any_errors_fatal 32935 1726853731.44956: checking for max_fail_percentage 32935 1726853731.44960: done checking for max_fail_percentage 32935 1726853731.44960: checking to see if all hosts have failed and the running result is not ok 32935 1726853731.44961: done checking to see if all hosts have failed 32935 1726853731.44961: getting the remaining hosts for this loop 32935 1726853731.44962: done getting the remaining hosts for this loop 32935 1726853731.44963: getting the next task for host managed_node1 32935 1726853731.44966: done getting next task for host managed_node1 32935 1726853731.44968: ^ task is: TASK: Include the task 'assert_device_present.yml' 32935 1726853731.44969: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853731.44972: getting variables 32935 1726853731.44973: in VariableManager get_vars() 32935 1726853731.44982: Calling all_inventory to load vars for managed_node1 32935 1726853731.44984: Calling groups_inventory to load vars for managed_node1 32935 1726853731.44985: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853731.44988: Calling all_plugins_play to load vars for managed_node1 32935 1726853731.44989: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853731.44991: Calling groups_plugins_play to load vars for managed_node1 32935 1726853731.45624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853731.46477: done with get_vars() 32935 1726853731.46491: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:46 Friday 20 September 2024 13:35:31 -0400 (0:00:00.423) 0:00:16.601 ****** 32935 1726853731.46541: entering _queue_task() for managed_node1/include_tasks 32935 1726853731.46802: worker is 1 (out of 1 available) 32935 1726853731.46815: exiting _queue_task() for managed_node1/include_tasks 32935 1726853731.46831: done queuing things up, now waiting for results queue to drain 32935 1726853731.46833: waiting for pending results... 
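
The task queued above comes from tests_vlan_mtu.yml:46 and is dispatched through the include_tasks action (the log enters _queue_task for managed_node1/include_tasks). The playbook source itself is not shown in this trace, so the following is only a plausible sketch of that line; the relative path is an assumption.

- name: Include the task 'assert_device_present.yml'
  include_tasks: tasks/assert_device_present.yml
  # the trace evaluates a conditional for this task, ansible_distribution_major_version != '6';
  # whether that when: is attached at task, block, or play level is not visible from the log
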
32935 1726853731.47002: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 32935 1726853731.47068: in run() - task 02083763-bbaf-84df-441d-00000000005b 32935 1726853731.47079: variable 'ansible_search_path' from source: unknown 32935 1726853731.47109: calling self._execute() 32935 1726853731.47184: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.47190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.47199: variable 'omit' from source: magic vars 32935 1726853731.47481: variable 'ansible_distribution_major_version' from source: facts 32935 1726853731.47492: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853731.47496: _execute() done 32935 1726853731.47500: dumping result to json 32935 1726853731.47509: done dumping result, returning 32935 1726853731.47513: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [02083763-bbaf-84df-441d-00000000005b] 32935 1726853731.47515: sending task result for task 02083763-bbaf-84df-441d-00000000005b 32935 1726853731.47603: done sending task result for task 02083763-bbaf-84df-441d-00000000005b 32935 1726853731.47606: WORKER PROCESS EXITING 32935 1726853731.47636: no more pending results, returning what we have 32935 1726853731.47641: in VariableManager get_vars() 32935 1726853731.47692: Calling all_inventory to load vars for managed_node1 32935 1726853731.47695: Calling groups_inventory to load vars for managed_node1 32935 1726853731.47697: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853731.47709: Calling all_plugins_play to load vars for managed_node1 32935 1726853731.47711: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853731.47714: Calling groups_plugins_play to load vars for managed_node1 32935 1726853731.48585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853731.49438: done with get_vars() 32935 1726853731.49452: variable 'ansible_search_path' from source: unknown 32935 1726853731.49465: we have included files to process 32935 1726853731.49466: generating all_blocks data 32935 1726853731.49467: done generating all_blocks data 32935 1726853731.49473: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32935 1726853731.49474: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32935 1726853731.49476: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 32935 1726853731.49545: in VariableManager get_vars() 32935 1726853731.49563: done with get_vars() 32935 1726853731.49640: done processing included file 32935 1726853731.49641: iterating over new_blocks loaded from include file 32935 1726853731.49642: in VariableManager get_vars() 32935 1726853731.49654: done with get_vars() 32935 1726853731.49655: filtering new block on tags 32935 1726853731.49669: done filtering new block on tags 32935 1726853731.49673: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 32935 1726853731.49676: extending task lists for 
all hosts with included blocks 32935 1726853731.51042: done extending task lists 32935 1726853731.51044: done processing included files 32935 1726853731.51044: results queue empty 32935 1726853731.51045: checking for any_errors_fatal 32935 1726853731.51046: done checking for any_errors_fatal 32935 1726853731.51046: checking for max_fail_percentage 32935 1726853731.51047: done checking for max_fail_percentage 32935 1726853731.51047: checking to see if all hosts have failed and the running result is not ok 32935 1726853731.51048: done checking to see if all hosts have failed 32935 1726853731.51048: getting the remaining hosts for this loop 32935 1726853731.51049: done getting the remaining hosts for this loop 32935 1726853731.51051: getting the next task for host managed_node1 32935 1726853731.51053: done getting next task for host managed_node1 32935 1726853731.51055: ^ task is: TASK: Include the task 'get_interface_stat.yml' 32935 1726853731.51056: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853731.51060: getting variables 32935 1726853731.51061: in VariableManager get_vars() 32935 1726853731.51073: Calling all_inventory to load vars for managed_node1 32935 1726853731.51075: Calling groups_inventory to load vars for managed_node1 32935 1726853731.51076: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853731.51081: Calling all_plugins_play to load vars for managed_node1 32935 1726853731.51082: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853731.51084: Calling groups_plugins_play to load vars for managed_node1 32935 1726853731.51760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853731.52610: done with get_vars() 32935 1726853731.52623: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:35:31 -0400 (0:00:00.061) 0:00:16.662 ****** 32935 1726853731.52683: entering _queue_task() for managed_node1/include_tasks 32935 1726853731.52945: worker is 1 (out of 1 available) 32935 1726853731.52963: exiting _queue_task() for managed_node1/include_tasks 32935 1726853731.52978: done queuing things up, now waiting for results queue to drain 32935 1726853731.52980: waiting for pending results... 
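
Per the task paths printed in the log, assert_device_present.yml is itself a two-step file: line 3 includes get_interface_stat.yml and line 5 holds the assert that runs afterwards. A sketch of the first of those tasks (the relative path is an assumption; the log also shows the interface variable arriving via include params, but not which include supplies it):

# tasks/assert_device_present.yml, sketch of the include at line 3
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
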
32935 1726853731.53151: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 32935 1726853731.53221: in run() - task 02083763-bbaf-84df-441d-000000000578 32935 1726853731.53230: variable 'ansible_search_path' from source: unknown 32935 1726853731.53234: variable 'ansible_search_path' from source: unknown 32935 1726853731.53264: calling self._execute() 32935 1726853731.53338: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.53342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.53351: variable 'omit' from source: magic vars 32935 1726853731.53632: variable 'ansible_distribution_major_version' from source: facts 32935 1726853731.53642: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853731.53653: _execute() done 32935 1726853731.53656: dumping result to json 32935 1726853731.53658: done dumping result, returning 32935 1726853731.53668: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-84df-441d-000000000578] 32935 1726853731.53672: sending task result for task 02083763-bbaf-84df-441d-000000000578 32935 1726853731.53749: done sending task result for task 02083763-bbaf-84df-441d-000000000578 32935 1726853731.53751: WORKER PROCESS EXITING 32935 1726853731.53782: no more pending results, returning what we have 32935 1726853731.53786: in VariableManager get_vars() 32935 1726853731.53828: Calling all_inventory to load vars for managed_node1 32935 1726853731.53831: Calling groups_inventory to load vars for managed_node1 32935 1726853731.53833: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853731.53847: Calling all_plugins_play to load vars for managed_node1 32935 1726853731.53849: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853731.53852: Calling groups_plugins_play to load vars for managed_node1 32935 1726853731.54625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853731.55472: done with get_vars() 32935 1726853731.55484: variable 'ansible_search_path' from source: unknown 32935 1726853731.55485: variable 'ansible_search_path' from source: unknown 32935 1726853731.55511: we have included files to process 32935 1726853731.55512: generating all_blocks data 32935 1726853731.55513: done generating all_blocks data 32935 1726853731.55514: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32935 1726853731.55514: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32935 1726853731.55516: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 32935 1726853731.55635: done processing included file 32935 1726853731.55637: iterating over new_blocks loaded from include file 32935 1726853731.55638: in VariableManager get_vars() 32935 1726853731.55649: done with get_vars() 32935 1726853731.55650: filtering new block on tags 32935 1726853731.55659: done filtering new block on tags 32935 1726853731.55661: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 32935 
1726853731.55664: extending task lists for all hosts with included blocks 32935 1726853731.55725: done extending task lists 32935 1726853731.55726: done processing included files 32935 1726853731.55726: results queue empty 32935 1726853731.55726: checking for any_errors_fatal 32935 1726853731.55729: done checking for any_errors_fatal 32935 1726853731.55729: checking for max_fail_percentage 32935 1726853731.55730: done checking for max_fail_percentage 32935 1726853731.55730: checking to see if all hosts have failed and the running result is not ok 32935 1726853731.55731: done checking to see if all hosts have failed 32935 1726853731.55731: getting the remaining hosts for this loop 32935 1726853731.55732: done getting the remaining hosts for this loop 32935 1726853731.55734: getting the next task for host managed_node1 32935 1726853731.55736: done getting next task for host managed_node1 32935 1726853731.55738: ^ task is: TASK: Get stat for interface {{ interface }} 32935 1726853731.55740: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853731.55741: getting variables 32935 1726853731.55742: in VariableManager get_vars() 32935 1726853731.55750: Calling all_inventory to load vars for managed_node1 32935 1726853731.55752: Calling groups_inventory to load vars for managed_node1 32935 1726853731.55753: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853731.55756: Calling all_plugins_play to load vars for managed_node1 32935 1726853731.55758: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853731.55760: Calling groups_plugins_play to load vars for managed_node1 32935 1726853731.59469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853731.60309: done with get_vars() 32935 1726853731.60325: done getting variables 32935 1726853731.60427: variable 'interface' from source: include params 32935 1726853731.60430: variable 'vlan_interface' from source: play vars 32935 1726853731.60473: variable 'vlan_interface' from source: play vars TASK [Get stat for interface lsr101.90] **************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:35:31 -0400 (0:00:00.078) 0:00:16.740 ****** 32935 1726853731.60492: entering _queue_task() for managed_node1/stat 32935 1726853731.60755: worker is 1 (out of 1 available) 32935 1726853731.60768: exiting _queue_task() for managed_node1/stat 32935 1726853731.60784: done queuing things up, now waiting for results queue to drain 32935 1726853731.60786: waiting for pending results... 
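
The task header above is already templated: 'Get stat for interface {{ interface }}' renders to 'Get stat for interface lsr101.90' because interface arrives as an include parameter that resolves through the vlan_interface play variable. Judging from the module invocation dumped further down (path /sys/class/net/lsr101.90 with get_attributes, get_checksum and get_mime all disabled), the task at get_interface_stat.yml:3 looks roughly like the sketch below. The register name interface_stat is an assumption; the log never prints it.

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}   # matches the logged module_args
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat                  # hypothetical name, not shown in the log
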
32935 1726853731.60969: running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr101.90 32935 1726853731.61052: in run() - task 02083763-bbaf-84df-441d-00000000069c 32935 1726853731.61064: variable 'ansible_search_path' from source: unknown 32935 1726853731.61067: variable 'ansible_search_path' from source: unknown 32935 1726853731.61096: calling self._execute() 32935 1726853731.61174: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.61178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.61187: variable 'omit' from source: magic vars 32935 1726853731.61452: variable 'ansible_distribution_major_version' from source: facts 32935 1726853731.61463: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853731.61466: variable 'omit' from source: magic vars 32935 1726853731.61497: variable 'omit' from source: magic vars 32935 1726853731.61566: variable 'interface' from source: include params 32935 1726853731.61569: variable 'vlan_interface' from source: play vars 32935 1726853731.61612: variable 'vlan_interface' from source: play vars 32935 1726853731.61626: variable 'omit' from source: magic vars 32935 1726853731.61666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853731.61691: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853731.61709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853731.61722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853731.61731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853731.61754: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853731.61760: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.61763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.61832: Set connection var ansible_timeout to 10 32935 1726853731.61837: Set connection var ansible_shell_type to sh 32935 1726853731.61844: Set connection var ansible_pipelining to False 32935 1726853731.61847: Set connection var ansible_connection to ssh 32935 1726853731.61852: Set connection var ansible_shell_executable to /bin/sh 32935 1726853731.61857: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853731.61877: variable 'ansible_shell_executable' from source: unknown 32935 1726853731.61882: variable 'ansible_connection' from source: unknown 32935 1726853731.61885: variable 'ansible_module_compression' from source: unknown 32935 1726853731.61888: variable 'ansible_shell_type' from source: unknown 32935 1726853731.61890: variable 'ansible_shell_executable' from source: unknown 32935 1726853731.61893: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.61897: variable 'ansible_pipelining' from source: unknown 32935 1726853731.61900: variable 'ansible_timeout' from source: unknown 32935 1726853731.61902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.62047: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853731.62055: variable 'omit' from source: magic vars 32935 1726853731.62062: starting attempt loop 32935 1726853731.62065: running the handler 32935 1726853731.62077: _low_level_execute_command(): starting 32935 1726853731.62086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853731.62608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.62611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853731.62614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853731.62618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.62664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.62670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853731.62674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.62727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.64411: stdout chunk (state=3): >>>/root <<< 32935 1726853731.64512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853731.64543: stderr chunk (state=3): >>><<< 32935 1726853731.64546: stdout chunk (state=3): >>><<< 32935 1726853731.64569: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 
1726853731.64583: _low_level_execute_command(): starting 32935 1726853731.64588: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309 `" && echo ansible-tmp-1726853731.6456914-33747-98548569928309="` echo /root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309 `" ) && sleep 0' 32935 1726853731.65038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853731.65042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853731.65051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.65054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.65057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.65105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.65112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853731.65114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.65154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.67132: stdout chunk (state=3): >>>ansible-tmp-1726853731.6456914-33747-98548569928309=/root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309 <<< 32935 1726853731.67226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853731.67240: stderr chunk (state=3): >>><<< 32935 1726853731.67250: stdout chunk (state=3): >>><<< 32935 1726853731.67283: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853731.6456914-33747-98548569928309=/root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853731.67406: variable 'ansible_module_compression' from source: unknown 32935 1726853731.67412: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32935 1726853731.67461: variable 'ansible_facts' from source: unknown 32935 1726853731.67573: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/AnsiballZ_stat.py 32935 1726853731.67754: Sending initial data 32935 1726853731.67761: Sent initial data (152 bytes) 32935 1726853731.68501: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.68505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.68517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853731.68535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.68605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.70118: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853731.70178: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853731.70251: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp90b0mjlf /root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/AnsiballZ_stat.py <<< 32935 1726853731.70255: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/AnsiballZ_stat.py" <<< 32935 1726853731.70287: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp90b0mjlf" to remote "/root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/AnsiballZ_stat.py" <<< 32935 1726853731.70812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853731.70852: stderr chunk (state=3): >>><<< 32935 1726853731.70855: stdout chunk (state=3): >>><<< 32935 1726853731.70875: done transferring module to remote 32935 1726853731.70885: _low_level_execute_command(): starting 32935 1726853731.70890: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/ /root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/AnsiballZ_stat.py && sleep 0' 32935 1726853731.71326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853731.71329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853731.71332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.71335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.71341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.71390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.71393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.71438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.73260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853731.73288: stderr chunk (state=3): >>><<< 32935 1726853731.73291: stdout chunk (state=3): >>><<< 32935 1726853731.73304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853731.73307: _low_level_execute_command(): starting 32935 1726853731.73312: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/AnsiballZ_stat.py && sleep 0' 32935 1726853731.73751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.73755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853731.73758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853731.73761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853731.73763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.73816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853731.73823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853731.73825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.73867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.88919: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30755, "dev": 23, "nlink": 1, "atime": 1726853730.7923887, "mtime": 1726853730.7923887, "ctime": 1726853730.7923887, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": 
"../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32935 1726853731.90334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853731.90367: stderr chunk (state=3): >>><<< 32935 1726853731.90375: stdout chunk (state=3): >>><<< 32935 1726853731.90543: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr101.90", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30755, "dev": 23, "nlink": 1, "atime": 1726853730.7923887, "mtime": 1726853730.7923887, "ctime": 1726853730.7923887, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
32935 1726853731.90547: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853731.90554: _low_level_execute_command(): starting 32935 1726853731.90556: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853731.6456914-33747-98548569928309/ > /dev/null 2>&1 && sleep 0' 32935 1726853731.91185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853731.91200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853731.91231: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853731.91246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853731.91335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853731.91360: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853731.91430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853731.93284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853731.93295: stdout chunk (state=3): >>><<< 32935 1726853731.93318: stderr chunk (state=3): >>><<< 32935 1726853731.93477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853731.93480: handler run complete 32935 1726853731.93482: attempt loop complete, returning result 32935 1726853731.93484: _execute() done 32935 1726853731.93485: dumping result to json 32935 1726853731.93487: done dumping result, returning 32935 1726853731.93489: done running TaskExecutor() for managed_node1/TASK: Get stat for interface lsr101.90 [02083763-bbaf-84df-441d-00000000069c] 32935 1726853731.93491: sending task result for task 02083763-bbaf-84df-441d-00000000069c 32935 1726853731.93568: done sending task result for task 02083763-bbaf-84df-441d-00000000069c 32935 1726853731.93579: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726853730.7923887, "block_size": 4096, "blocks": 0, "ctime": 1726853730.7923887, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30755, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr101.90", "lnk_target": "../../devices/virtual/net/lsr101.90", "mode": "0777", "mtime": 1726853730.7923887, "nlink": 1, "path": "/sys/class/net/lsr101.90", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 32935 1726853731.93880: no more pending results, returning what we have 32935 1726853731.93884: results queue empty 32935 1726853731.93886: checking for any_errors_fatal 32935 1726853731.93887: done checking for any_errors_fatal 32935 1726853731.93888: checking for max_fail_percentage 32935 1726853731.93889: done checking for max_fail_percentage 32935 1726853731.93890: checking to see if all hosts have failed and the running result is not ok 32935 1726853731.93891: done checking to see if all hosts have failed 32935 1726853731.93892: getting the remaining hosts for this loop 32935 1726853731.93894: done getting the remaining hosts for this loop 32935 1726853731.93897: getting the next task for host managed_node1 32935 1726853731.93905: done getting next task for host managed_node1 32935 1726853731.93908: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 32935 1726853731.93910: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853731.93915: getting variables 32935 1726853731.93916: in VariableManager get_vars() 32935 1726853731.93957: Calling all_inventory to load vars for managed_node1 32935 1726853731.93963: Calling groups_inventory to load vars for managed_node1 32935 1726853731.93966: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853731.93993: Calling all_plugins_play to load vars for managed_node1 32935 1726853731.93996: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853731.94000: Calling groups_plugins_play to load vars for managed_node1 32935 1726853731.95551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853731.97407: done with get_vars() 32935 1726853731.97428: done getting variables 32935 1726853731.97501: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853731.97627: variable 'interface' from source: include params 32935 1726853731.97631: variable 'vlan_interface' from source: play vars 32935 1726853731.97700: variable 'vlan_interface' from source: play vars TASK [Assert that the interface is present - 'lsr101.90'] ********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:35:31 -0400 (0:00:00.372) 0:00:17.112 ****** 32935 1726853731.97736: entering _queue_task() for managed_node1/assert 32935 1726853731.98103: worker is 1 (out of 1 available) 32935 1726853731.98232: exiting _queue_task() for managed_node1/assert 32935 1726853731.98244: done queuing things up, now waiting for results queue to drain 32935 1726853731.98246: waiting for pending results... 
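The stat result above and the assert about to run both come from tasks/assert_device_present.yml (the task paths are printed in the task headers). The file itself is not reproduced in this log, so the following YAML is only a reconstruction from the stat module arguments and the interface_stat.stat.exists conditional evaluated below; the register name and exact layout are assumptions.

  - name: Get stat for interface {{ interface }}
    stat:
      get_attributes: false
      get_checksum: false
      get_mime: false
      path: /sys/class/net/{{ interface }}
    register: interface_stat   # assumed name, inferred from the conditional in the assert below

  - name: Assert that the interface is present - '{{ interface }}'
    assert:
      that:
        - interface_stat.stat.exists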
32935 1726853731.98432: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr101.90' 32935 1726853731.98569: in run() - task 02083763-bbaf-84df-441d-000000000579 32935 1726853731.98592: variable 'ansible_search_path' from source: unknown 32935 1726853731.98601: variable 'ansible_search_path' from source: unknown 32935 1726853731.98640: calling self._execute() 32935 1726853731.98779: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.98783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.98786: variable 'omit' from source: magic vars 32935 1726853731.99177: variable 'ansible_distribution_major_version' from source: facts 32935 1726853731.99195: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853731.99226: variable 'omit' from source: magic vars 32935 1726853731.99262: variable 'omit' from source: magic vars 32935 1726853731.99433: variable 'interface' from source: include params 32935 1726853731.99439: variable 'vlan_interface' from source: play vars 32935 1726853731.99461: variable 'vlan_interface' from source: play vars 32935 1726853731.99488: variable 'omit' from source: magic vars 32935 1726853731.99541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853731.99591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853731.99619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853731.99662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853731.99677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853731.99761: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853731.99765: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.99767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853731.99845: Set connection var ansible_timeout to 10 32935 1726853731.99857: Set connection var ansible_shell_type to sh 32935 1726853731.99887: Set connection var ansible_pipelining to False 32935 1726853731.99895: Set connection var ansible_connection to ssh 32935 1726853731.99905: Set connection var ansible_shell_executable to /bin/sh 32935 1726853731.99980: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853731.99983: variable 'ansible_shell_executable' from source: unknown 32935 1726853731.99985: variable 'ansible_connection' from source: unknown 32935 1726853731.99988: variable 'ansible_module_compression' from source: unknown 32935 1726853731.99990: variable 'ansible_shell_type' from source: unknown 32935 1726853731.99992: variable 'ansible_shell_executable' from source: unknown 32935 1726853731.99994: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853731.99996: variable 'ansible_pipelining' from source: unknown 32935 1726853731.99999: variable 'ansible_timeout' from source: unknown 32935 1726853732.00001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.00146: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853732.00168: variable 'omit' from source: magic vars 32935 1726853732.00182: starting attempt loop 32935 1726853732.00224: running the handler 32935 1726853732.00350: variable 'interface_stat' from source: set_fact 32935 1726853732.00381: Evaluated conditional (interface_stat.stat.exists): True 32935 1726853732.00391: handler run complete 32935 1726853732.00417: attempt loop complete, returning result 32935 1726853732.00441: _execute() done 32935 1726853732.00444: dumping result to json 32935 1726853732.00446: done dumping result, returning 32935 1726853732.00476: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'lsr101.90' [02083763-bbaf-84df-441d-000000000579] 32935 1726853732.00479: sending task result for task 02083763-bbaf-84df-441d-000000000579 32935 1726853732.00720: done sending task result for task 02083763-bbaf-84df-441d-000000000579 32935 1726853732.00723: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 32935 1726853732.00782: no more pending results, returning what we have 32935 1726853732.00786: results queue empty 32935 1726853732.00788: checking for any_errors_fatal 32935 1726853732.00799: done checking for any_errors_fatal 32935 1726853732.00800: checking for max_fail_percentage 32935 1726853732.00802: done checking for max_fail_percentage 32935 1726853732.00803: checking to see if all hosts have failed and the running result is not ok 32935 1726853732.00804: done checking to see if all hosts have failed 32935 1726853732.00804: getting the remaining hosts for this loop 32935 1726853732.00806: done getting the remaining hosts for this loop 32935 1726853732.00809: getting the next task for host managed_node1 32935 1726853732.00818: done getting next task for host managed_node1 32935 1726853732.00821: ^ task is: TASK: Include the task 'assert_profile_present.yml' 32935 1726853732.00823: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853732.00828: getting variables 32935 1726853732.00830: in VariableManager get_vars() 32935 1726853732.00883: Calling all_inventory to load vars for managed_node1 32935 1726853732.00886: Calling groups_inventory to load vars for managed_node1 32935 1726853732.00889: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853732.00901: Calling all_plugins_play to load vars for managed_node1 32935 1726853732.00903: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853732.00906: Calling groups_plugins_play to load vars for managed_node1 32935 1726853732.02573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853732.04318: done with get_vars() 32935 1726853732.04349: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:50 Friday 20 September 2024 13:35:32 -0400 (0:00:00.067) 0:00:17.180 ****** 32935 1726853732.04448: entering _queue_task() for managed_node1/include_tasks 32935 1726853732.04835: worker is 1 (out of 1 available) 32935 1726853732.04848: exiting _queue_task() for managed_node1/include_tasks 32935 1726853732.04979: done queuing things up, now waiting for results queue to drain 32935 1726853732.04982: waiting for pending results... 32935 1726853732.05218: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 32935 1726853732.05315: in run() - task 02083763-bbaf-84df-441d-00000000005c 32935 1726853732.05318: variable 'ansible_search_path' from source: unknown 32935 1726853732.05418: variable 'interface' from source: play vars 32935 1726853732.05587: variable 'interface' from source: play vars 32935 1726853732.05609: variable 'vlan_interface' from source: play vars 32935 1726853732.05694: variable 'vlan_interface' from source: play vars 32935 1726853732.05745: variable 'omit' from source: magic vars 32935 1726853732.05890: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.05907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.05965: variable 'omit' from source: magic vars 32935 1726853732.06210: variable 'ansible_distribution_major_version' from source: facts 32935 1726853732.06226: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853732.06289: variable 'item' from source: unknown 32935 1726853732.06338: variable 'item' from source: unknown 32935 1726853732.06788: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.06791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.06793: variable 'omit' from source: magic vars 32935 1726853732.06795: variable 'ansible_distribution_major_version' from source: facts 32935 1726853732.06797: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853732.06799: variable 'item' from source: unknown 32935 1726853732.06852: variable 'item' from source: unknown 32935 1726853732.07048: dumping result to json 32935 1726853732.07051: done dumping result, returning 32935 1726853732.07054: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [02083763-bbaf-84df-441d-00000000005c] 32935 1726853732.07056: sending task result for task 02083763-bbaf-84df-441d-00000000005c 32935 
1726853732.07104: done sending task result for task 02083763-bbaf-84df-441d-00000000005c 32935 1726853732.07107: WORKER PROCESS EXITING 32935 1726853732.07182: no more pending results, returning what we have 32935 1726853732.07187: in VariableManager get_vars() 32935 1726853732.07242: Calling all_inventory to load vars for managed_node1 32935 1726853732.07246: Calling groups_inventory to load vars for managed_node1 32935 1726853732.07248: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853732.07265: Calling all_plugins_play to load vars for managed_node1 32935 1726853732.07268: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853732.07274: Calling groups_plugins_play to load vars for managed_node1 32935 1726853732.09107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853732.10775: done with get_vars() 32935 1726853732.10796: variable 'ansible_search_path' from source: unknown 32935 1726853732.10813: variable 'ansible_search_path' from source: unknown 32935 1726853732.10821: we have included files to process 32935 1726853732.10829: generating all_blocks data 32935 1726853732.10831: done generating all_blocks data 32935 1726853732.10835: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32935 1726853732.10836: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32935 1726853732.10839: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32935 1726853732.11054: in VariableManager get_vars() 32935 1726853732.11083: done with get_vars() 32935 1726853732.11354: done processing included file 32935 1726853732.11356: iterating over new_blocks loaded from include file 32935 1726853732.11357: in VariableManager get_vars() 32935 1726853732.11387: done with get_vars() 32935 1726853732.11389: filtering new block on tags 32935 1726853732.11409: done filtering new block on tags 32935 1726853732.11412: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=lsr101) 32935 1726853732.11417: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32935 1726853732.11418: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32935 1726853732.11421: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 32935 1726853732.11529: in VariableManager get_vars() 32935 1726853732.11550: done with get_vars() 32935 1726853732.11790: done processing included file 32935 1726853732.11792: iterating over new_blocks loaded from include file 32935 1726853732.11793: in VariableManager get_vars() 32935 1726853732.11816: done with get_vars() 32935 1726853732.11818: filtering new block on tags 32935 1726853732.11834: done filtering new block on tags 32935 1726853732.11837: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=lsr101.90) 32935 1726853732.11841: extending task lists for all hosts with included blocks 32935 1726853732.13939: done extending task lists 32935 1726853732.13940: done processing included files 32935 1726853732.13941: results queue empty 32935 1726853732.13941: checking for any_errors_fatal 32935 1726853732.13944: done checking for any_errors_fatal 32935 1726853732.13945: checking for max_fail_percentage 32935 1726853732.13945: done checking for max_fail_percentage 32935 1726853732.13946: checking to see if all hosts have failed and the running result is not ok 32935 1726853732.13946: done checking to see if all hosts have failed 32935 1726853732.13947: getting the remaining hosts for this loop 32935 1726853732.13948: done getting the remaining hosts for this loop 32935 1726853732.13949: getting the next task for host managed_node1 32935 1726853732.13952: done getting next task for host managed_node1 32935 1726853732.13954: ^ task is: TASK: Include the task 'get_profile_stat.yml' 32935 1726853732.13955: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853732.13957: getting variables 32935 1726853732.13960: in VariableManager get_vars() 32935 1726853732.13973: Calling all_inventory to load vars for managed_node1 32935 1726853732.13975: Calling groups_inventory to load vars for managed_node1 32935 1726853732.13976: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853732.13981: Calling all_plugins_play to load vars for managed_node1 32935 1726853732.13983: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853732.13991: Calling groups_plugins_play to load vars for managed_node1 32935 1726853732.14643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853732.16048: done with get_vars() 32935 1726853732.16074: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:35:32 -0400 (0:00:00.116) 0:00:17.296 ****** 32935 1726853732.16135: entering _queue_task() for managed_node1/include_tasks 32935 1726853732.16454: worker is 1 (out of 1 available) 32935 1726853732.16469: exiting _queue_task() for managed_node1/include_tasks 32935 1726853732.16485: done queuing things up, now waiting for results queue to drain 32935 1726853732.16487: waiting for pending results... 
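The two include results above (item=lsr101 and item=lsr101.90) are produced by the task "Include the task 'assert_profile_present.yml'" at tests_vlan_mtu.yml:50. Its source is not shown in this log; a sketch consistent with the loop items and the 'profile' include parameter that appears later would be (the variable wiring is an assumption):

  - name: Include the task 'assert_profile_present.yml'
    include_tasks: tasks/assert_profile_present.yml
    vars:
      profile: "{{ item }}"        # 'profile' shows up later as an include param; passing it via vars is assumed
    loop:
      - "{{ interface }}"          # lsr101
      - "{{ vlan_interface }}"     # lsr101.90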
32935 1726853732.16719: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 32935 1726853732.16879: in run() - task 02083763-bbaf-84df-441d-0000000006b8 32935 1726853732.16883: variable 'ansible_search_path' from source: unknown 32935 1726853732.16886: variable 'ansible_search_path' from source: unknown 32935 1726853732.16911: calling self._execute() 32935 1726853732.17013: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.17076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.17080: variable 'omit' from source: magic vars 32935 1726853732.17454: variable 'ansible_distribution_major_version' from source: facts 32935 1726853732.17476: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853732.17488: _execute() done 32935 1726853732.17497: dumping result to json 32935 1726853732.17505: done dumping result, returning 32935 1726853732.17516: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-84df-441d-0000000006b8] 32935 1726853732.17526: sending task result for task 02083763-bbaf-84df-441d-0000000006b8 32935 1726853732.17632: done sending task result for task 02083763-bbaf-84df-441d-0000000006b8 32935 1726853732.17635: WORKER PROCESS EXITING 32935 1726853732.17682: no more pending results, returning what we have 32935 1726853732.17687: in VariableManager get_vars() 32935 1726853732.17731: Calling all_inventory to load vars for managed_node1 32935 1726853732.17734: Calling groups_inventory to load vars for managed_node1 32935 1726853732.17737: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853732.17750: Calling all_plugins_play to load vars for managed_node1 32935 1726853732.17752: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853732.17755: Calling groups_plugins_play to load vars for managed_node1 32935 1726853732.18660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853732.19587: done with get_vars() 32935 1726853732.19604: variable 'ansible_search_path' from source: unknown 32935 1726853732.19605: variable 'ansible_search_path' from source: unknown 32935 1726853732.19639: we have included files to process 32935 1726853732.19640: generating all_blocks data 32935 1726853732.19641: done generating all_blocks data 32935 1726853732.19642: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32935 1726853732.19643: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32935 1726853732.19645: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32935 1726853732.20643: done processing included file 32935 1726853732.20644: iterating over new_blocks loaded from include file 32935 1726853732.20645: in VariableManager get_vars() 32935 1726853732.20662: done with get_vars() 32935 1726853732.20664: filtering new block on tags 32935 1726853732.20679: done filtering new block on tags 32935 1726853732.20681: in VariableManager get_vars() 32935 1726853732.20692: done with get_vars() 32935 1726853732.20693: filtering new block on tags 32935 1726853732.20706: done filtering new block on tags 32935 1726853732.20707: done iterating over 
new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 32935 1726853732.20710: extending task lists for all hosts with included blocks 32935 1726853732.20813: done extending task lists 32935 1726853732.20814: done processing included files 32935 1726853732.20815: results queue empty 32935 1726853732.20815: checking for any_errors_fatal 32935 1726853732.20817: done checking for any_errors_fatal 32935 1726853732.20818: checking for max_fail_percentage 32935 1726853732.20818: done checking for max_fail_percentage 32935 1726853732.20819: checking to see if all hosts have failed and the running result is not ok 32935 1726853732.20819: done checking to see if all hosts have failed 32935 1726853732.20820: getting the remaining hosts for this loop 32935 1726853732.20820: done getting the remaining hosts for this loop 32935 1726853732.20822: getting the next task for host managed_node1 32935 1726853732.20825: done getting next task for host managed_node1 32935 1726853732.20826: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 32935 1726853732.20828: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853732.20829: getting variables 32935 1726853732.20830: in VariableManager get_vars() 32935 1726853732.20882: Calling all_inventory to load vars for managed_node1 32935 1726853732.20884: Calling groups_inventory to load vars for managed_node1 32935 1726853732.20885: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853732.20889: Calling all_plugins_play to load vars for managed_node1 32935 1726853732.20890: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853732.20892: Calling groups_plugins_play to load vars for managed_node1 32935 1726853732.21553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853732.22404: done with get_vars() 32935 1726853732.22420: done getting variables 32935 1726853732.22446: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:35:32 -0400 (0:00:00.063) 0:00:17.360 ****** 32935 1726853732.22469: entering _queue_task() for managed_node1/set_fact 32935 1726853732.22723: worker is 1 (out of 1 available) 32935 1726853732.22737: exiting _queue_task() for managed_node1/set_fact 32935 1726853732.22751: done queuing things up, now waiting for results queue to drain 32935 1726853732.22753: waiting for pending results... 
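The set_fact task queued here ("Initialize NM profile exist and ansible_managed comment flag", get_profile_stat.yml:3) initializes the three lsr_net_profile_* flags to false, as the ansible_facts in its result a little further down show. A minimal sketch of that task:

  - name: Initialize NM profile exist and ansible_managed comment flag
    set_fact:
      lsr_net_profile_exists: false
      lsr_net_profile_ansible_managed: false
      lsr_net_profile_fingerprint: false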
32935 1726853732.22920: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 32935 1726853732.22989: in run() - task 02083763-bbaf-84df-441d-0000000007f0 32935 1726853732.23000: variable 'ansible_search_path' from source: unknown 32935 1726853732.23004: variable 'ansible_search_path' from source: unknown 32935 1726853732.23031: calling self._execute() 32935 1726853732.23108: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.23112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.23122: variable 'omit' from source: magic vars 32935 1726853732.23406: variable 'ansible_distribution_major_version' from source: facts 32935 1726853732.23421: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853732.23424: variable 'omit' from source: magic vars 32935 1726853732.23455: variable 'omit' from source: magic vars 32935 1726853732.23484: variable 'omit' from source: magic vars 32935 1726853732.23516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853732.23546: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853732.23564: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853732.23580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853732.23589: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853732.23612: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853732.23615: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.23620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.23692: Set connection var ansible_timeout to 10 32935 1726853732.23696: Set connection var ansible_shell_type to sh 32935 1726853732.23703: Set connection var ansible_pipelining to False 32935 1726853732.23705: Set connection var ansible_connection to ssh 32935 1726853732.23710: Set connection var ansible_shell_executable to /bin/sh 32935 1726853732.23715: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853732.23740: variable 'ansible_shell_executable' from source: unknown 32935 1726853732.23744: variable 'ansible_connection' from source: unknown 32935 1726853732.23747: variable 'ansible_module_compression' from source: unknown 32935 1726853732.23749: variable 'ansible_shell_type' from source: unknown 32935 1726853732.23752: variable 'ansible_shell_executable' from source: unknown 32935 1726853732.23754: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.23757: variable 'ansible_pipelining' from source: unknown 32935 1726853732.23761: variable 'ansible_timeout' from source: unknown 32935 1726853732.23764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.23856: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853732.23866: variable 
'omit' from source: magic vars 32935 1726853732.23873: starting attempt loop 32935 1726853732.23875: running the handler 32935 1726853732.23887: handler run complete 32935 1726853732.23896: attempt loop complete, returning result 32935 1726853732.23898: _execute() done 32935 1726853732.23901: dumping result to json 32935 1726853732.23903: done dumping result, returning 32935 1726853732.23910: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-84df-441d-0000000007f0] 32935 1726853732.23915: sending task result for task 02083763-bbaf-84df-441d-0000000007f0 32935 1726853732.23992: done sending task result for task 02083763-bbaf-84df-441d-0000000007f0 32935 1726853732.23995: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 32935 1726853732.24049: no more pending results, returning what we have 32935 1726853732.24052: results queue empty 32935 1726853732.24053: checking for any_errors_fatal 32935 1726853732.24054: done checking for any_errors_fatal 32935 1726853732.24055: checking for max_fail_percentage 32935 1726853732.24056: done checking for max_fail_percentage 32935 1726853732.24060: checking to see if all hosts have failed and the running result is not ok 32935 1726853732.24061: done checking to see if all hosts have failed 32935 1726853732.24062: getting the remaining hosts for this loop 32935 1726853732.24063: done getting the remaining hosts for this loop 32935 1726853732.24066: getting the next task for host managed_node1 32935 1726853732.24076: done getting next task for host managed_node1 32935 1726853732.24078: ^ task is: TASK: Stat profile file 32935 1726853732.24081: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853732.24086: getting variables 32935 1726853732.24087: in VariableManager get_vars() 32935 1726853732.24133: Calling all_inventory to load vars for managed_node1 32935 1726853732.24135: Calling groups_inventory to load vars for managed_node1 32935 1726853732.24137: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853732.24147: Calling all_plugins_play to load vars for managed_node1 32935 1726853732.24149: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853732.24152: Calling groups_plugins_play to load vars for managed_node1 32935 1726853732.24928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853732.25799: done with get_vars() 32935 1726853732.25814: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:35:32 -0400 (0:00:00.034) 0:00:17.394 ****** 32935 1726853732.25885: entering _queue_task() for managed_node1/stat 32935 1726853732.26118: worker is 1 (out of 1 available) 32935 1726853732.26133: exiting _queue_task() for managed_node1/stat 32935 1726853732.26146: done queuing things up, now waiting for results queue to drain 32935 1726853732.26148: waiting for pending results... 32935 1726853732.26319: running TaskExecutor() for managed_node1/TASK: Stat profile file 32935 1726853732.26393: in run() - task 02083763-bbaf-84df-441d-0000000007f1 32935 1726853732.26405: variable 'ansible_search_path' from source: unknown 32935 1726853732.26408: variable 'ansible_search_path' from source: unknown 32935 1726853732.26436: calling self._execute() 32935 1726853732.26516: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.26522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.26531: variable 'omit' from source: magic vars 32935 1726853732.26821: variable 'ansible_distribution_major_version' from source: facts 32935 1726853732.26828: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853732.26834: variable 'omit' from source: magic vars 32935 1726853732.26869: variable 'omit' from source: magic vars 32935 1726853732.26942: variable 'profile' from source: include params 32935 1726853732.26945: variable 'item' from source: include params 32935 1726853732.26995: variable 'item' from source: include params 32935 1726853732.27009: variable 'omit' from source: magic vars 32935 1726853732.27044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853732.27075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853732.27092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853732.27105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853732.27114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853732.27139: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853732.27143: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.27145: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.27216: Set connection var ansible_timeout to 10 32935 1726853732.27220: Set connection var ansible_shell_type to sh 32935 1726853732.27227: Set connection var ansible_pipelining to False 32935 1726853732.27230: Set connection var ansible_connection to ssh 32935 1726853732.27234: Set connection var ansible_shell_executable to /bin/sh 32935 1726853732.27241: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853732.27262: variable 'ansible_shell_executable' from source: unknown 32935 1726853732.27265: variable 'ansible_connection' from source: unknown 32935 1726853732.27267: variable 'ansible_module_compression' from source: unknown 32935 1726853732.27270: variable 'ansible_shell_type' from source: unknown 32935 1726853732.27275: variable 'ansible_shell_executable' from source: unknown 32935 1726853732.27277: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.27279: variable 'ansible_pipelining' from source: unknown 32935 1726853732.27282: variable 'ansible_timeout' from source: unknown 32935 1726853732.27284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.27431: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853732.27440: variable 'omit' from source: magic vars 32935 1726853732.27446: starting attempt loop 32935 1726853732.27449: running the handler 32935 1726853732.27461: _low_level_execute_command(): starting 32935 1726853732.27473: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853732.27963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.27992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.27996: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853732.27999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.28048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853732.28051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.28053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.28108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.29795: stdout chunk (state=3): >>>/root <<< 32935 1726853732.29898: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 32935 1726853732.29931: stderr chunk (state=3): >>><<< 32935 1726853732.29935: stdout chunk (state=3): >>><<< 32935 1726853732.29953: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853732.29967: _low_level_execute_command(): starting 32935 1726853732.29974: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214 `" && echo ansible-tmp-1726853732.299534-33774-131264182474214="` echo /root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214 `" ) && sleep 0' 32935 1726853732.30409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853732.30412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853732.30423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.30425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.30427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.30476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.30485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.30521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.32402: stdout chunk (state=3): 
>>>ansible-tmp-1726853732.299534-33774-131264182474214=/root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214 <<< 32935 1726853732.32506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853732.32534: stderr chunk (state=3): >>><<< 32935 1726853732.32537: stdout chunk (state=3): >>><<< 32935 1726853732.32553: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853732.299534-33774-131264182474214=/root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853732.32595: variable 'ansible_module_compression' from source: unknown 32935 1726853732.32644: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32935 1726853732.32676: variable 'ansible_facts' from source: unknown 32935 1726853732.32740: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/AnsiballZ_stat.py 32935 1726853732.32843: Sending initial data 32935 1726853732.32847: Sent initial data (152 bytes) 32935 1726853732.33303: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853732.33306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.33308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853732.33310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.33312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853732.33314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853732.33316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 32935 1726853732.33365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853732.33369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.33379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.33414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.34938: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 32935 1726853732.34945: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853732.34976: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853732.35014: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp6a73763x /root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/AnsiballZ_stat.py <<< 32935 1726853732.35022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/AnsiballZ_stat.py" <<< 32935 1726853732.35056: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp6a73763x" to remote "/root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/AnsiballZ_stat.py" <<< 32935 1726853732.35060: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/AnsiballZ_stat.py" <<< 32935 1726853732.35593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853732.35633: stderr chunk (state=3): >>><<< 32935 1726853732.35636: stdout chunk (state=3): >>><<< 32935 1726853732.35683: done transferring module to remote 32935 1726853732.35692: _low_level_execute_command(): starting 32935 1726853732.35696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/ /root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/AnsiballZ_stat.py && sleep 0' 32935 1726853732.36141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853732.36144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853732.36150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853732.36153: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853732.36155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.36209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.36212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.36247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.37959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853732.37990: stderr chunk (state=3): >>><<< 32935 1726853732.37993: stdout chunk (state=3): >>><<< 32935 1726853732.38007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853732.38010: _low_level_execute_command(): starting 32935 1726853732.38016: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/AnsiballZ_stat.py && sleep 0' 32935 1726853732.38430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.38461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853732.38464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853732.38466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.38469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 
1726853732.38473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853732.38476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.38528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853732.38533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.38536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.38579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.53813: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32935 1726853732.55306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853732.55310: stdout chunk (state=3): >>><<< 32935 1726853732.55312: stderr chunk (state=3): >>><<< 32935 1726853732.55315: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
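The module invocation above ("Stat profile file", get_profile_stat.yml:9) checks for an initscripts profile at /etc/sysconfig/network-scripts/ifcfg-lsr101 and finds nothing (exists: false). Reconstructed from the invocation arguments, with the path templating and register name as assumptions:

  - name: Stat profile file
    stat:
      path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # 'profile' is lsr101 for this include
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: profile_stat   # hypothetical register name; the actual name is not visible in this log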
32935 1726853732.55318: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853732.55320: _low_level_execute_command(): starting 32935 1726853732.55322: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853732.299534-33774-131264182474214/ > /dev/null 2>&1 && sleep 0' 32935 1726853732.55922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.55926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853732.55960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.55963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.55966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.56022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853732.56026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.56028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.56094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.57937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853732.57964: stderr chunk (state=3): >>><<< 32935 1726853732.57967: stdout chunk (state=3): >>><<< 32935 1726853732.57983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853732.57989: handler run complete 32935 1726853732.58005: attempt loop complete, returning result 32935 1726853732.58008: _execute() done 32935 1726853732.58010: dumping result to json 32935 1726853732.58013: done dumping result, returning 32935 1726853732.58021: done running TaskExecutor() for managed_node1/TASK: Stat profile file [02083763-bbaf-84df-441d-0000000007f1] 32935 1726853732.58025: sending task result for task 02083763-bbaf-84df-441d-0000000007f1 32935 1726853732.58117: done sending task result for task 02083763-bbaf-84df-441d-0000000007f1 32935 1726853732.58120: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 32935 1726853732.58174: no more pending results, returning what we have 32935 1726853732.58177: results queue empty 32935 1726853732.58178: checking for any_errors_fatal 32935 1726853732.58185: done checking for any_errors_fatal 32935 1726853732.58186: checking for max_fail_percentage 32935 1726853732.58188: done checking for max_fail_percentage 32935 1726853732.58189: checking to see if all hosts have failed and the running result is not ok 32935 1726853732.58190: done checking to see if all hosts have failed 32935 1726853732.58191: getting the remaining hosts for this loop 32935 1726853732.58192: done getting the remaining hosts for this loop 32935 1726853732.58195: getting the next task for host managed_node1 32935 1726853732.58203: done getting next task for host managed_node1 32935 1726853732.58205: ^ task is: TASK: Set NM profile exist flag based on the profile files 32935 1726853732.58208: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853732.58212: getting variables 32935 1726853732.58214: in VariableManager get_vars() 32935 1726853732.58254: Calling all_inventory to load vars for managed_node1 32935 1726853732.58257: Calling groups_inventory to load vars for managed_node1 32935 1726853732.58260: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853732.58272: Calling all_plugins_play to load vars for managed_node1 32935 1726853732.58275: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853732.58278: Calling groups_plugins_play to load vars for managed_node1 32935 1726853732.59428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853732.60455: done with get_vars() 32935 1726853732.60475: done getting variables 32935 1726853732.60517: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:35:32 -0400 (0:00:00.346) 0:00:17.741 ****** 32935 1726853732.60540: entering _queue_task() for managed_node1/set_fact 32935 1726853732.60778: worker is 1 (out of 1 available) 32935 1726853732.60792: exiting _queue_task() for managed_node1/set_fact 32935 1726853732.60807: done queuing things up, now waiting for results queue to drain 32935 1726853732.60808: waiting for pending results... 
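The set_fact task queued above (get_profile_stat.yml:17) ends up skipped because profile_stat.stat.exists is false, so its body never appears in this log. The following is only a hypothetical sketch of the shape such a guarded set_fact usually takes; the fact name is invented for illustration:

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true   # hypothetical fact name; not confirmed by the log
  when: profile_stat.stat.exists   # matches the false_condition reported in the skip result below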
32935 1726853732.60982: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 32935 1726853732.61056: in run() - task 02083763-bbaf-84df-441d-0000000007f2 32935 1726853732.61069: variable 'ansible_search_path' from source: unknown 32935 1726853732.61075: variable 'ansible_search_path' from source: unknown 32935 1726853732.61103: calling self._execute() 32935 1726853732.61184: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.61187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.61197: variable 'omit' from source: magic vars 32935 1726853732.61481: variable 'ansible_distribution_major_version' from source: facts 32935 1726853732.61491: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853732.61574: variable 'profile_stat' from source: set_fact 32935 1726853732.61586: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853732.61589: when evaluation is False, skipping this task 32935 1726853732.61592: _execute() done 32935 1726853732.61595: dumping result to json 32935 1726853732.61597: done dumping result, returning 32935 1726853732.61604: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-84df-441d-0000000007f2] 32935 1726853732.61610: sending task result for task 02083763-bbaf-84df-441d-0000000007f2 32935 1726853732.61688: done sending task result for task 02083763-bbaf-84df-441d-0000000007f2 32935 1726853732.61691: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853732.61737: no more pending results, returning what we have 32935 1726853732.61741: results queue empty 32935 1726853732.61742: checking for any_errors_fatal 32935 1726853732.61750: done checking for any_errors_fatal 32935 1726853732.61751: checking for max_fail_percentage 32935 1726853732.61753: done checking for max_fail_percentage 32935 1726853732.61754: checking to see if all hosts have failed and the running result is not ok 32935 1726853732.61755: done checking to see if all hosts have failed 32935 1726853732.61756: getting the remaining hosts for this loop 32935 1726853732.61757: done getting the remaining hosts for this loop 32935 1726853732.61761: getting the next task for host managed_node1 32935 1726853732.61769: done getting next task for host managed_node1 32935 1726853732.61773: ^ task is: TASK: Get NM profile info 32935 1726853732.61777: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853732.61781: getting variables 32935 1726853732.61783: in VariableManager get_vars() 32935 1726853732.61827: Calling all_inventory to load vars for managed_node1 32935 1726853732.61830: Calling groups_inventory to load vars for managed_node1 32935 1726853732.61832: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853732.61844: Calling all_plugins_play to load vars for managed_node1 32935 1726853732.61847: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853732.61849: Calling groups_plugins_play to load vars for managed_node1 32935 1726853732.62603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853732.63452: done with get_vars() 32935 1726853732.63468: done getting variables 32935 1726853732.63534: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:35:32 -0400 (0:00:00.030) 0:00:17.771 ****** 32935 1726853732.63556: entering _queue_task() for managed_node1/shell 32935 1726853732.63557: Creating lock for shell 32935 1726853732.63778: worker is 1 (out of 1 available) 32935 1726853732.63792: exiting _queue_task() for managed_node1/shell 32935 1726853732.63805: done queuing things up, now waiting for results queue to drain 32935 1726853732.63807: waiting for pending results... 
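For the "Get NM profile info" task queued above, the log does show the resolved command and its result, so a sketch can be grounded fairly well. Assumptions: the literal lsr101 in the logged command comes from a templated profile variable, the register name nm_profile_exists is inferred from the later nm_profile_exists.rc == 0 check, and ignore_errors is a guess to cover the rc=1 case when grep matches nothing:

- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists   # assumed; read later as nm_profile_exists.rc
  ignore_errors: true           # assumption; not visible in the logged invocation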
32935 1726853732.63975: running TaskExecutor() for managed_node1/TASK: Get NM profile info 32935 1726853732.64056: in run() - task 02083763-bbaf-84df-441d-0000000007f3 32935 1726853732.64067: variable 'ansible_search_path' from source: unknown 32935 1726853732.64072: variable 'ansible_search_path' from source: unknown 32935 1726853732.64106: calling self._execute() 32935 1726853732.64182: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.64188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.64198: variable 'omit' from source: magic vars 32935 1726853732.64476: variable 'ansible_distribution_major_version' from source: facts 32935 1726853732.64488: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853732.64492: variable 'omit' from source: magic vars 32935 1726853732.64522: variable 'omit' from source: magic vars 32935 1726853732.64595: variable 'profile' from source: include params 32935 1726853732.64598: variable 'item' from source: include params 32935 1726853732.64644: variable 'item' from source: include params 32935 1726853732.64668: variable 'omit' from source: magic vars 32935 1726853732.64704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853732.64731: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853732.64747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853732.64763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853732.64770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853732.64796: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853732.64799: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.64803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.64870: Set connection var ansible_timeout to 10 32935 1726853732.64876: Set connection var ansible_shell_type to sh 32935 1726853732.64883: Set connection var ansible_pipelining to False 32935 1726853732.64886: Set connection var ansible_connection to ssh 32935 1726853732.64891: Set connection var ansible_shell_executable to /bin/sh 32935 1726853732.64896: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853732.64915: variable 'ansible_shell_executable' from source: unknown 32935 1726853732.64918: variable 'ansible_connection' from source: unknown 32935 1726853732.64921: variable 'ansible_module_compression' from source: unknown 32935 1726853732.64923: variable 'ansible_shell_type' from source: unknown 32935 1726853732.64925: variable 'ansible_shell_executable' from source: unknown 32935 1726853732.64927: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853732.64932: variable 'ansible_pipelining' from source: unknown 32935 1726853732.64935: variable 'ansible_timeout' from source: unknown 32935 1726853732.64937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853732.65042: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853732.65046: variable 'omit' from source: magic vars 32935 1726853732.65051: starting attempt loop 32935 1726853732.65054: running the handler 32935 1726853732.65063: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853732.65080: _low_level_execute_command(): starting 32935 1726853732.65087: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853732.65741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.65767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853732.65770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.65793: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.65796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.65839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853732.65842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.65848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.65903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.67593: stdout chunk (state=3): >>>/root <<< 32935 1726853732.67673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853732.67718: stderr chunk (state=3): >>><<< 32935 1726853732.67721: stdout chunk (state=3): >>><<< 32935 1726853732.67836: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853732.67840: _low_level_execute_command(): starting 32935 1726853732.67843: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909 `" && echo ansible-tmp-1726853732.677438-33791-212997234514909="` echo /root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909 `" ) && sleep 0' 32935 1726853732.68396: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853732.68420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853732.68434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.68450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853732.68467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853732.68530: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.68586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.68607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.68682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.70620: stdout chunk (state=3): >>>ansible-tmp-1726853732.677438-33791-212997234514909=/root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909 <<< 32935 1726853732.70795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853732.70799: stdout chunk (state=3): >>><<< 32935 1726853732.70808: stderr chunk (state=3): >>><<< 32935 1726853732.70827: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853732.677438-33791-212997234514909=/root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853732.70917: variable 'ansible_module_compression' from source: unknown 32935 1726853732.70936: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853732.70986: variable 'ansible_facts' from source: unknown 32935 1726853732.71093: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/AnsiballZ_command.py 32935 1726853732.71266: Sending initial data 32935 1726853732.71269: Sent initial data (155 bytes) 32935 1726853732.71987: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.72031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853732.72046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.72067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.72235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.73783: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853732.73929: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/AnsiballZ_command.py" <<< 32935 1726853732.73933: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp9pwz1inh /root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/AnsiballZ_command.py <<< 32935 1726853732.73981: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp9pwz1inh" to remote "/root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/AnsiballZ_command.py" <<< 32935 1726853732.75527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853732.75531: stdout chunk (state=3): >>><<< 32935 1726853732.75533: stderr chunk (state=3): >>><<< 32935 1726853732.75535: done transferring module to remote 32935 1726853732.75537: _low_level_execute_command(): starting 32935 1726853732.75540: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/ /root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/AnsiballZ_command.py && sleep 0' 32935 1726853732.76164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853732.76186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853732.76290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.76310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.76382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.78234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853732.78253: stdout chunk (state=3): >>><<< 32935 1726853732.78269: stderr chunk (state=3): >>><<< 32935 1726853732.78295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853732.78303: _low_level_execute_command(): starting 32935 1726853732.78313: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/AnsiballZ_command.py && sleep 0' 32935 1726853732.78941: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853732.78956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853732.78969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.78991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853732.79008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853732.79041: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853732.79121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.79150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.79227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853732.96257: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-20 13:35:32.941519", "end": "2024-09-20 13:35:32.960918", "delta": "0:00:00.019399", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853732.97878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853732.97882: stdout chunk (state=3): >>><<< 32935 1726853732.97884: stderr chunk (state=3): >>><<< 32935 1726853732.97887: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection \nlsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "start": "2024-09-20 13:35:32.941519", "end": "2024-09-20 13:35:32.960918", "delta": "0:00:00.019399", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
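Note that every SSH round trip in this run reuses an existing ControlMaster socket (auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'), which is why only mux_client messages appear instead of a full handshake per command. If one wanted to pin that behaviour explicitly instead of relying on the ssh connection plugin's defaults, it could be expressed with standard OpenSSH options through an inventory variable; the values below are illustrative and not taken from this run's configuration:

# group_vars/all.yml (illustrative sketch, not this run's actual settings)
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=~/.ansible/cp/%C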
32935 1726853732.97890: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853732.97892: _low_level_execute_command(): starting 32935 1726853732.97894: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853732.677438-33791-212997234514909/ > /dev/null 2>&1 && sleep 0' 32935 1726853732.98512: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853732.98520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853732.98530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853732.98590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853732.98641: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853732.98653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853732.98675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853732.98747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853733.00681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853733.00685: stderr chunk (state=3): >>><<< 32935 1726853733.00688: stdout chunk (state=3): >>><<< 32935 1726853733.00690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853733.00693: handler run complete 32935 1726853733.00695: Evaluated conditional (False): False 32935 1726853733.00711: attempt loop complete, returning result 32935 1726853733.00714: _execute() done 32935 1726853733.00717: dumping result to json 32935 1726853733.00722: done dumping result, returning 32935 1726853733.00730: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [02083763-bbaf-84df-441d-0000000007f3] 32935 1726853733.00735: sending task result for task 02083763-bbaf-84df-441d-0000000007f3 32935 1726853733.00841: done sending task result for task 02083763-bbaf-84df-441d-0000000007f3 32935 1726853733.00845: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101 | grep /etc", "delta": "0:00:00.019399", "end": "2024-09-20 13:35:32.960918", "rc": 0, "start": "2024-09-20 13:35:32.941519" } STDOUT: lsr101 /etc/NetworkManager/system-connections/lsr101.nmconnection lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 32935 1726853733.01049: no more pending results, returning what we have 32935 1726853733.01053: results queue empty 32935 1726853733.01054: checking for any_errors_fatal 32935 1726853733.01065: done checking for any_errors_fatal 32935 1726853733.01066: checking for max_fail_percentage 32935 1726853733.01068: done checking for max_fail_percentage 32935 1726853733.01069: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.01070: done checking to see if all hosts have failed 32935 1726853733.01080: getting the remaining hosts for this loop 32935 1726853733.01082: done getting the remaining hosts for this loop 32935 1726853733.01085: getting the next task for host managed_node1 32935 1726853733.01093: done getting next task for host managed_node1 32935 1726853733.01096: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32935 1726853733.01100: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.01104: getting variables 32935 1726853733.01106: in VariableManager get_vars() 32935 1726853733.01149: Calling all_inventory to load vars for managed_node1 32935 1726853733.01152: Calling groups_inventory to load vars for managed_node1 32935 1726853733.01155: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.01297: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.01303: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.01307: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.02970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.04994: done with get_vars() 32935 1726853733.05021: done getting variables 32935 1726853733.05094: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:35:33 -0400 (0:00:00.415) 0:00:18.186 ****** 32935 1726853733.05124: entering _queue_task() for managed_node1/set_fact 32935 1726853733.05509: worker is 1 (out of 1 available) 32935 1726853733.05523: exiting _queue_task() for managed_node1/set_fact 32935 1726853733.05535: done queuing things up, now waiting for results queue to drain 32935 1726853733.05537: waiting for pending results... 
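The set_fact task queued above (get_profile_stat.yml:35) does run, because nm_profile_exists.rc == 0 evaluates True below, and the facts it sets are printed in its ok: result. A minimal sketch consistent with that output (the task file itself is not reproduced in the log, so treat this as a reconstruction):

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0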
32935 1726853733.05895: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32935 1726853733.05960: in run() - task 02083763-bbaf-84df-441d-0000000007f4 32935 1726853733.05977: variable 'ansible_search_path' from source: unknown 32935 1726853733.05980: variable 'ansible_search_path' from source: unknown 32935 1726853733.06027: calling self._execute() 32935 1726853733.06135: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.06140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.06152: variable 'omit' from source: magic vars 32935 1726853733.06795: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.06801: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.06929: variable 'nm_profile_exists' from source: set_fact 32935 1726853733.06945: Evaluated conditional (nm_profile_exists.rc == 0): True 32935 1726853733.06951: variable 'omit' from source: magic vars 32935 1726853733.07312: variable 'omit' from source: magic vars 32935 1726853733.07477: variable 'omit' from source: magic vars 32935 1726853733.07480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853733.07538: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853733.07561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853733.07634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.07644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.07679: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853733.07683: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.07685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.07912: Set connection var ansible_timeout to 10 32935 1726853733.07926: Set connection var ansible_shell_type to sh 32935 1726853733.07929: Set connection var ansible_pipelining to False 32935 1726853733.07931: Set connection var ansible_connection to ssh 32935 1726853733.07933: Set connection var ansible_shell_executable to /bin/sh 32935 1726853733.07938: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853733.08088: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.08091: variable 'ansible_connection' from source: unknown 32935 1726853733.08094: variable 'ansible_module_compression' from source: unknown 32935 1726853733.08096: variable 'ansible_shell_type' from source: unknown 32935 1726853733.08099: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.08101: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.08106: variable 'ansible_pipelining' from source: unknown 32935 1726853733.08109: variable 'ansible_timeout' from source: unknown 32935 1726853733.08111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.08370: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853733.08489: variable 'omit' from source: magic vars 32935 1726853733.08501: starting attempt loop 32935 1726853733.08505: running the handler 32935 1726853733.08519: handler run complete 32935 1726853733.08577: attempt loop complete, returning result 32935 1726853733.08580: _execute() done 32935 1726853733.08582: dumping result to json 32935 1726853733.08585: done dumping result, returning 32935 1726853733.08588: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-84df-441d-0000000007f4] 32935 1726853733.08590: sending task result for task 02083763-bbaf-84df-441d-0000000007f4 32935 1726853733.08660: done sending task result for task 02083763-bbaf-84df-441d-0000000007f4 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 32935 1726853733.08723: no more pending results, returning what we have 32935 1726853733.08727: results queue empty 32935 1726853733.08728: checking for any_errors_fatal 32935 1726853733.08740: done checking for any_errors_fatal 32935 1726853733.08741: checking for max_fail_percentage 32935 1726853733.08743: done checking for max_fail_percentage 32935 1726853733.08744: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.08745: done checking to see if all hosts have failed 32935 1726853733.08746: getting the remaining hosts for this loop 32935 1726853733.08748: done getting the remaining hosts for this loop 32935 1726853733.08752: getting the next task for host managed_node1 32935 1726853733.08766: done getting next task for host managed_node1 32935 1726853733.08768: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 32935 1726853733.08775: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.08780: getting variables 32935 1726853733.08783: in VariableManager get_vars() 32935 1726853733.08943: Calling all_inventory to load vars for managed_node1 32935 1726853733.08946: Calling groups_inventory to load vars for managed_node1 32935 1726853733.08949: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.08962: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.08965: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.08968: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.09500: WORKER PROCESS EXITING 32935 1726853733.10400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.12249: done with get_vars() 32935 1726853733.12266: done getting variables 32935 1726853733.12312: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853733.12400: variable 'profile' from source: include params 32935 1726853733.12403: variable 'item' from source: include params 32935 1726853733.12446: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:35:33 -0400 (0:00:00.073) 0:00:18.260 ****** 32935 1726853733.12477: entering _queue_task() for managed_node1/command 32935 1726853733.12721: worker is 1 (out of 1 available) 32935 1726853733.12735: exiting _queue_task() for managed_node1/command 32935 1726853733.12747: done queuing things up, now waiting for results queue to drain 32935 1726853733.12749: waiting for pending results... 
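The command task queued above (get_profile_stat.yml:49) is skipped because profile_stat.stat.exists is false, as reported in its result below, so its body never runs and is not visible here. The command and register name in this sketch are purely hypothetical, shown only to illustrate the guard:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical command
  register: ifcfg_ansible_managed   # hypothetical register name
  when: profile_stat.stat.exists    # confirmed by the skip result's false_condition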
32935 1726853733.12926: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr101 32935 1726853733.13007: in run() - task 02083763-bbaf-84df-441d-0000000007f6 32935 1726853733.13017: variable 'ansible_search_path' from source: unknown 32935 1726853733.13021: variable 'ansible_search_path' from source: unknown 32935 1726853733.13051: calling self._execute() 32935 1726853733.13126: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.13130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.13138: variable 'omit' from source: magic vars 32935 1726853733.13428: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.13436: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.13676: variable 'profile_stat' from source: set_fact 32935 1726853733.13679: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853733.13681: when evaluation is False, skipping this task 32935 1726853733.13682: _execute() done 32935 1726853733.13685: dumping result to json 32935 1726853733.13686: done dumping result, returning 32935 1726853733.13688: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr101 [02083763-bbaf-84df-441d-0000000007f6] 32935 1726853733.13690: sending task result for task 02083763-bbaf-84df-441d-0000000007f6 32935 1726853733.13748: done sending task result for task 02083763-bbaf-84df-441d-0000000007f6 32935 1726853733.13750: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853733.13803: no more pending results, returning what we have 32935 1726853733.13807: results queue empty 32935 1726853733.13808: checking for any_errors_fatal 32935 1726853733.13818: done checking for any_errors_fatal 32935 1726853733.13818: checking for max_fail_percentage 32935 1726853733.13820: done checking for max_fail_percentage 32935 1726853733.13821: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.13822: done checking to see if all hosts have failed 32935 1726853733.13823: getting the remaining hosts for this loop 32935 1726853733.13824: done getting the remaining hosts for this loop 32935 1726853733.13828: getting the next task for host managed_node1 32935 1726853733.13836: done getting next task for host managed_node1 32935 1726853733.13839: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 32935 1726853733.13842: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.13846: getting variables 32935 1726853733.13847: in VariableManager get_vars() 32935 1726853733.13976: Calling all_inventory to load vars for managed_node1 32935 1726853733.13983: Calling groups_inventory to load vars for managed_node1 32935 1726853733.13986: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.13995: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.13998: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.14001: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.16517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.18252: done with get_vars() 32935 1726853733.18285: done getting variables 32935 1726853733.18344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853733.18519: variable 'profile' from source: include params 32935 1726853733.18523: variable 'item' from source: include params 32935 1726853733.18641: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101] ********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:35:33 -0400 (0:00:00.061) 0:00:18.322 ****** 32935 1726853733.18682: entering _queue_task() for managed_node1/set_fact 32935 1726853733.19039: worker is 1 (out of 1 available) 32935 1726853733.19051: exiting _queue_task() for managed_node1/set_fact 32935 1726853733.19066: done queuing things up, now waiting for results queue to drain 32935 1726853733.19068: waiting for pending results... 
32935 1726853733.19526: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101 32935 1726853733.19532: in run() - task 02083763-bbaf-84df-441d-0000000007f7 32935 1726853733.19535: variable 'ansible_search_path' from source: unknown 32935 1726853733.19538: variable 'ansible_search_path' from source: unknown 32935 1726853733.19540: calling self._execute() 32935 1726853733.19840: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.19843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.19846: variable 'omit' from source: magic vars 32935 1726853733.20040: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.20044: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.20179: variable 'profile_stat' from source: set_fact 32935 1726853733.20196: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853733.20199: when evaluation is False, skipping this task 32935 1726853733.20202: _execute() done 32935 1726853733.20205: dumping result to json 32935 1726853733.20207: done dumping result, returning 32935 1726853733.20211: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101 [02083763-bbaf-84df-441d-0000000007f7] 32935 1726853733.20221: sending task result for task 02083763-bbaf-84df-441d-0000000007f7 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853733.20380: no more pending results, returning what we have 32935 1726853733.20384: results queue empty 32935 1726853733.20386: checking for any_errors_fatal 32935 1726853733.20395: done checking for any_errors_fatal 32935 1726853733.20396: checking for max_fail_percentage 32935 1726853733.20399: done checking for max_fail_percentage 32935 1726853733.20400: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.20401: done checking to see if all hosts have failed 32935 1726853733.20402: getting the remaining hosts for this loop 32935 1726853733.20403: done getting the remaining hosts for this loop 32935 1726853733.20407: getting the next task for host managed_node1 32935 1726853733.20415: done getting next task for host managed_node1 32935 1726853733.20418: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 32935 1726853733.20422: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.20428: getting variables 32935 1726853733.20430: in VariableManager get_vars() 32935 1726853733.20478: Calling all_inventory to load vars for managed_node1 32935 1726853733.20481: Calling groups_inventory to load vars for managed_node1 32935 1726853733.20484: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.20499: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.20502: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.20506: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.21279: done sending task result for task 02083763-bbaf-84df-441d-0000000007f7 32935 1726853733.21283: WORKER PROCESS EXITING 32935 1726853733.22946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.24836: done with get_vars() 32935 1726853733.24863: done getting variables 32935 1726853733.25026: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853733.25345: variable 'profile' from source: include params 32935 1726853733.25349: variable 'item' from source: include params 32935 1726853733.25411: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:35:33 -0400 (0:00:00.067) 0:00:18.390 ****** 32935 1726853733.25442: entering _queue_task() for managed_node1/command 32935 1726853733.25820: worker is 1 (out of 1 available) 32935 1726853733.25833: exiting _queue_task() for managed_node1/command 32935 1726853733.25847: done queuing things up, now waiting for results queue to drain 32935 1726853733.25849: waiting for pending results... 
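The next queued task (get_profile_stat.yml:62) uses the command action module and is skipped for the same reason. A hedged sketch follows, assuming the fingerprint comment lives in an initscripts-style ifcfg file; the file path, grep pattern, register name, and failed_when handling are assumptions, while the task name and the when gate come from the log:

  # get_profile_stat.yml:62 (sketch)
  - name: Get the fingerprint comment in ifcfg-{{ profile }}
    command: grep system_role /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed pattern and path
    register: fingerprint_comment    # hypothetical register name
    failed_when: false               # assumed: a missing comment should not fail the play
    when: profile_stat.stat.exists   # False here, so the command never reaches the host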
32935 1726853733.26093: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr101 32935 1726853733.26223: in run() - task 02083763-bbaf-84df-441d-0000000007f8 32935 1726853733.26249: variable 'ansible_search_path' from source: unknown 32935 1726853733.26258: variable 'ansible_search_path' from source: unknown 32935 1726853733.26299: calling self._execute() 32935 1726853733.26398: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.26408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.26420: variable 'omit' from source: magic vars 32935 1726853733.26803: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.26876: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.26945: variable 'profile_stat' from source: set_fact 32935 1726853733.26973: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853733.26977: when evaluation is False, skipping this task 32935 1726853733.26980: _execute() done 32935 1726853733.26982: dumping result to json 32935 1726853733.26985: done dumping result, returning 32935 1726853733.26992: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr101 [02083763-bbaf-84df-441d-0000000007f8] 32935 1726853733.26998: sending task result for task 02083763-bbaf-84df-441d-0000000007f8 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853733.27137: no more pending results, returning what we have 32935 1726853733.27142: results queue empty 32935 1726853733.27143: checking for any_errors_fatal 32935 1726853733.27153: done checking for any_errors_fatal 32935 1726853733.27154: checking for max_fail_percentage 32935 1726853733.27156: done checking for max_fail_percentage 32935 1726853733.27157: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.27161: done checking to see if all hosts have failed 32935 1726853733.27374: getting the remaining hosts for this loop 32935 1726853733.27376: done getting the remaining hosts for this loop 32935 1726853733.27380: getting the next task for host managed_node1 32935 1726853733.27387: done getting next task for host managed_node1 32935 1726853733.27389: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 32935 1726853733.27393: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.27397: getting variables 32935 1726853733.27398: in VariableManager get_vars() 32935 1726853733.27435: Calling all_inventory to load vars for managed_node1 32935 1726853733.27438: Calling groups_inventory to load vars for managed_node1 32935 1726853733.27441: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.27447: done sending task result for task 02083763-bbaf-84df-441d-0000000007f8 32935 1726853733.27450: WORKER PROCESS EXITING 32935 1726853733.27463: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.27467: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.27472: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.28884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.30429: done with get_vars() 32935 1726853733.30451: done getting variables 32935 1726853733.30511: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853733.30617: variable 'profile' from source: include params 32935 1726853733.30620: variable 'item' from source: include params 32935 1726853733.30680: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:35:33 -0400 (0:00:00.052) 0:00:18.442 ****** 32935 1726853733.30711: entering _queue_task() for managed_node1/set_fact 32935 1726853733.31099: worker is 1 (out of 1 available) 32935 1726853733.31110: exiting _queue_task() for managed_node1/set_fact 32935 1726853733.31120: done queuing things up, now waiting for results queue to drain 32935 1726853733.31121: waiting for pending results... 
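get_profile_stat.yml:69 is another set_fact, again skipped because profile_stat.stat.exists is False. A sketch of its likely shape; the flag name is taken from the conditional evaluated later in this section (lsr_net_profile_fingerprint), and the fingerprint_comment register is carried over from the hypothetical command sketch above:

  # get_profile_stat.yml:69 (sketch)
  - name: Verify the fingerprint comment in ifcfg-{{ profile }}
    set_fact:
      lsr_net_profile_fingerprint: true
    when:
      - profile_stat.stat.exists                     # confirmed by the log
      - fingerprint_comment.rc | default(1) == 0     # assumed follow-up on the hypothetical grep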
32935 1726853733.31306: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr101 32935 1726853733.31456: in run() - task 02083763-bbaf-84df-441d-0000000007f9 32935 1726853733.31460: variable 'ansible_search_path' from source: unknown 32935 1726853733.31463: variable 'ansible_search_path' from source: unknown 32935 1726853733.31469: calling self._execute() 32935 1726853733.31554: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.31677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.31682: variable 'omit' from source: magic vars 32935 1726853733.31933: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.31944: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.32067: variable 'profile_stat' from source: set_fact 32935 1726853733.32082: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853733.32086: when evaluation is False, skipping this task 32935 1726853733.32089: _execute() done 32935 1726853733.32091: dumping result to json 32935 1726853733.32094: done dumping result, returning 32935 1726853733.32103: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr101 [02083763-bbaf-84df-441d-0000000007f9] 32935 1726853733.32105: sending task result for task 02083763-bbaf-84df-441d-0000000007f9 32935 1726853733.32194: done sending task result for task 02083763-bbaf-84df-441d-0000000007f9 32935 1726853733.32197: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853733.32277: no more pending results, returning what we have 32935 1726853733.32280: results queue empty 32935 1726853733.32282: checking for any_errors_fatal 32935 1726853733.32289: done checking for any_errors_fatal 32935 1726853733.32290: checking for max_fail_percentage 32935 1726853733.32292: done checking for max_fail_percentage 32935 1726853733.32293: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.32294: done checking to see if all hosts have failed 32935 1726853733.32295: getting the remaining hosts for this loop 32935 1726853733.32296: done getting the remaining hosts for this loop 32935 1726853733.32300: getting the next task for host managed_node1 32935 1726853733.32310: done getting next task for host managed_node1 32935 1726853733.32313: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 32935 1726853733.32316: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.32322: getting variables 32935 1726853733.32324: in VariableManager get_vars() 32935 1726853733.32367: Calling all_inventory to load vars for managed_node1 32935 1726853733.32370: Calling groups_inventory to load vars for managed_node1 32935 1726853733.32374: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.32387: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.32390: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.32393: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.34065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.35573: done with get_vars() 32935 1726853733.35597: done getting variables 32935 1726853733.35655: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853733.35776: variable 'profile' from source: include params 32935 1726853733.35780: variable 'item' from source: include params 32935 1726853733.35842: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:35:33 -0400 (0:00:00.051) 0:00:18.494 ****** 32935 1726853733.35879: entering _queue_task() for managed_node1/assert 32935 1726853733.36213: worker is 1 (out of 1 available) 32935 1726853733.36225: exiting _queue_task() for managed_node1/assert 32935 1726853733.36238: done queuing things up, now waiting for results queue to drain 32935 1726853733.36239: waiting for pending results... 
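Execution now moves into assert_profile_present.yml. The task paths logged in this section (lines 3, 5, 10, and 15 of that file) and the three conditionals evaluated below (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint) outline its structure; a sketch reconstructed from those log entries follows, with any fail_msg text and exact formatting left out as unknown. The include at line 3 runs before the asserts within each inclusion of the file; the include logged further below appears to belong to the next queued copy (its task id differs from the asserts seen here).

  # assert_profile_present.yml (sketch reconstructed from logged task paths and conditionals)
  - name: Include the task 'get_profile_stat.yml'                                  # line 3
    include_tasks: get_profile_stat.yml

  - name: Assert that the profile is present - '{{ profile }}'                     # line 5
    assert:
      that:
        - lsr_net_profile_exists

  - name: Assert that the ansible managed comment is present in '{{ profile }}'    # line 10
    assert:
      that:
        - lsr_net_profile_ansible_managed

  - name: Assert that the fingerprint comment is present in {{ profile }}          # line 15
    assert:
      that:
        - lsr_net_profile_fingerprint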
32935 1726853733.36523: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'lsr101' 32935 1726853733.36619: in run() - task 02083763-bbaf-84df-441d-0000000006b9 32935 1726853733.36677: variable 'ansible_search_path' from source: unknown 32935 1726853733.36681: variable 'ansible_search_path' from source: unknown 32935 1726853733.36684: calling self._execute() 32935 1726853733.36767: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.36774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.36785: variable 'omit' from source: magic vars 32935 1726853733.37153: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.37165: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.37262: variable 'omit' from source: magic vars 32935 1726853733.37265: variable 'omit' from source: magic vars 32935 1726853733.37310: variable 'profile' from source: include params 32935 1726853733.37313: variable 'item' from source: include params 32935 1726853733.37382: variable 'item' from source: include params 32935 1726853733.37400: variable 'omit' from source: magic vars 32935 1726853733.37440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853733.37483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853733.37504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853733.37520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.37531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.37566: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853733.37569: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.37575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.37676: Set connection var ansible_timeout to 10 32935 1726853733.37683: Set connection var ansible_shell_type to sh 32935 1726853733.37693: Set connection var ansible_pipelining to False 32935 1726853733.37696: Set connection var ansible_connection to ssh 32935 1726853733.37698: Set connection var ansible_shell_executable to /bin/sh 32935 1726853733.37703: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853733.37728: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.37731: variable 'ansible_connection' from source: unknown 32935 1726853733.37734: variable 'ansible_module_compression' from source: unknown 32935 1726853733.37736: variable 'ansible_shell_type' from source: unknown 32935 1726853733.37738: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.37741: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.37803: variable 'ansible_pipelining' from source: unknown 32935 1726853733.37806: variable 'ansible_timeout' from source: unknown 32935 1726853733.37809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.37894: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853733.37906: variable 'omit' from source: magic vars 32935 1726853733.37909: starting attempt loop 32935 1726853733.37912: running the handler 32935 1726853733.38027: variable 'lsr_net_profile_exists' from source: set_fact 32935 1726853733.38031: Evaluated conditional (lsr_net_profile_exists): True 32935 1726853733.38037: handler run complete 32935 1726853733.38050: attempt loop complete, returning result 32935 1726853733.38053: _execute() done 32935 1726853733.38056: dumping result to json 32935 1726853733.38059: done dumping result, returning 32935 1726853733.38136: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'lsr101' [02083763-bbaf-84df-441d-0000000006b9] 32935 1726853733.38138: sending task result for task 02083763-bbaf-84df-441d-0000000006b9 32935 1726853733.38197: done sending task result for task 02083763-bbaf-84df-441d-0000000006b9 32935 1726853733.38200: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 32935 1726853733.38254: no more pending results, returning what we have 32935 1726853733.38261: results queue empty 32935 1726853733.38262: checking for any_errors_fatal 32935 1726853733.38269: done checking for any_errors_fatal 32935 1726853733.38272: checking for max_fail_percentage 32935 1726853733.38274: done checking for max_fail_percentage 32935 1726853733.38275: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.38277: done checking to see if all hosts have failed 32935 1726853733.38277: getting the remaining hosts for this loop 32935 1726853733.38279: done getting the remaining hosts for this loop 32935 1726853733.38282: getting the next task for host managed_node1 32935 1726853733.38290: done getting next task for host managed_node1 32935 1726853733.38292: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 32935 1726853733.38295: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.38300: getting variables 32935 1726853733.38302: in VariableManager get_vars() 32935 1726853733.38345: Calling all_inventory to load vars for managed_node1 32935 1726853733.38347: Calling groups_inventory to load vars for managed_node1 32935 1726853733.38351: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.38365: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.38369: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.38578: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.39997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.41935: done with get_vars() 32935 1726853733.41957: done getting variables 32935 1726853733.42019: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853733.42137: variable 'profile' from source: include params 32935 1726853733.42141: variable 'item' from source: include params 32935 1726853733.42204: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101'] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:35:33 -0400 (0:00:00.063) 0:00:18.557 ****** 32935 1726853733.42239: entering _queue_task() for managed_node1/assert 32935 1726853733.42796: worker is 1 (out of 1 available) 32935 1726853733.42803: exiting _queue_task() for managed_node1/assert 32935 1726853733.42813: done queuing things up, now waiting for results queue to drain 32935 1726853733.42815: waiting for pending results... 
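Each assert handler run above is preceded by the same block of "Set connection var" lines. The values are taken straight from the log; whether they come from ansible-core defaults or from inventory is not shown (most of the underlying variables report 'from source: unknown', which typically means defaults). Expressed as host variables, the equivalent settings would be:

  # Connection settings equivalent to the logged "Set connection var" lines (values from the log)
  ansible_connection: ssh
  ansible_shell_type: sh
  ansible_shell_executable: /bin/sh
  ansible_pipelining: false
  ansible_timeout: 10
  # ansible_module_compression is reported as ZIP_DEFLATED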
32935 1726853733.42945: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'lsr101' 32935 1726853733.42966: in run() - task 02083763-bbaf-84df-441d-0000000006ba 32935 1726853733.42988: variable 'ansible_search_path' from source: unknown 32935 1726853733.42996: variable 'ansible_search_path' from source: unknown 32935 1726853733.43043: calling self._execute() 32935 1726853733.43137: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.43155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.43173: variable 'omit' from source: magic vars 32935 1726853733.43525: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.43541: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.43551: variable 'omit' from source: magic vars 32935 1726853733.43600: variable 'omit' from source: magic vars 32935 1726853733.43715: variable 'profile' from source: include params 32935 1726853733.43806: variable 'item' from source: include params 32935 1726853733.43809: variable 'item' from source: include params 32935 1726853733.43824: variable 'omit' from source: magic vars 32935 1726853733.43873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853733.43922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853733.43949: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853733.43975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.43992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.44034: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853733.44043: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.44051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.44161: Set connection var ansible_timeout to 10 32935 1726853733.44176: Set connection var ansible_shell_type to sh 32935 1726853733.44189: Set connection var ansible_pipelining to False 32935 1726853733.44196: Set connection var ansible_connection to ssh 32935 1726853733.44242: Set connection var ansible_shell_executable to /bin/sh 32935 1726853733.44246: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853733.44249: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.44257: variable 'ansible_connection' from source: unknown 32935 1726853733.44264: variable 'ansible_module_compression' from source: unknown 32935 1726853733.44272: variable 'ansible_shell_type' from source: unknown 32935 1726853733.44279: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.44286: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.44294: variable 'ansible_pipelining' from source: unknown 32935 1726853733.44351: variable 'ansible_timeout' from source: unknown 32935 1726853733.44354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.44463: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853733.44483: variable 'omit' from source: magic vars 32935 1726853733.44494: starting attempt loop 32935 1726853733.44501: running the handler 32935 1726853733.44616: variable 'lsr_net_profile_ansible_managed' from source: set_fact 32935 1726853733.44627: Evaluated conditional (lsr_net_profile_ansible_managed): True 32935 1726853733.44638: handler run complete 32935 1726853733.44680: attempt loop complete, returning result 32935 1726853733.44683: _execute() done 32935 1726853733.44685: dumping result to json 32935 1726853733.44688: done dumping result, returning 32935 1726853733.44690: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'lsr101' [02083763-bbaf-84df-441d-0000000006ba] 32935 1726853733.44701: sending task result for task 02083763-bbaf-84df-441d-0000000006ba 32935 1726853733.44851: done sending task result for task 02083763-bbaf-84df-441d-0000000006ba 32935 1726853733.44854: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 32935 1726853733.44943: no more pending results, returning what we have 32935 1726853733.44947: results queue empty 32935 1726853733.44948: checking for any_errors_fatal 32935 1726853733.44956: done checking for any_errors_fatal 32935 1726853733.44956: checking for max_fail_percentage 32935 1726853733.44958: done checking for max_fail_percentage 32935 1726853733.44959: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.44960: done checking to see if all hosts have failed 32935 1726853733.44961: getting the remaining hosts for this loop 32935 1726853733.44963: done getting the remaining hosts for this loop 32935 1726853733.44966: getting the next task for host managed_node1 32935 1726853733.44977: done getting next task for host managed_node1 32935 1726853733.44979: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 32935 1726853733.44982: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.44987: getting variables 32935 1726853733.44989: in VariableManager get_vars() 32935 1726853733.45038: Calling all_inventory to load vars for managed_node1 32935 1726853733.45041: Calling groups_inventory to load vars for managed_node1 32935 1726853733.45044: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.45058: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.45061: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.45064: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.46659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.48215: done with get_vars() 32935 1726853733.48242: done getting variables 32935 1726853733.48308: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853733.48420: variable 'profile' from source: include params 32935 1726853733.48424: variable 'item' from source: include params 32935 1726853733.48486: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:35:33 -0400 (0:00:00.062) 0:00:18.620 ****** 32935 1726853733.48523: entering _queue_task() for managed_node1/assert 32935 1726853733.48867: worker is 1 (out of 1 available) 32935 1726853733.49079: exiting _queue_task() for managed_node1/assert 32935 1726853733.49090: done queuing things up, now waiting for results queue to drain 32935 1726853733.49092: waiting for pending results... 
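Throughout this block, 'profile' and 'item' are reported as coming from include params, and the rendered task names substitute lsr101 for {{ profile }}. That pattern is consistent with assert_profile_present.yml being included once per profile; a hypothetical sketch of such a caller follows, where the loop contents and the calling file are assumptions and only the profile=lsr101 binding is visible in the log:

  # Hypothetical caller of assert_profile_present.yml
  - name: Assert profiles are present
    include_tasks: tasks/assert_profile_present.yml
    vars:
      profile: "{{ item }}"   # rendered as 'lsr101' in the tasks above
    loop:
      - lsr101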
32935 1726853733.49222: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in lsr101 32935 1726853733.49293: in run() - task 02083763-bbaf-84df-441d-0000000006bb 32935 1726853733.49320: variable 'ansible_search_path' from source: unknown 32935 1726853733.49429: variable 'ansible_search_path' from source: unknown 32935 1726853733.49433: calling self._execute() 32935 1726853733.49470: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.49484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.49499: variable 'omit' from source: magic vars 32935 1726853733.49883: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.49900: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.49912: variable 'omit' from source: magic vars 32935 1726853733.49952: variable 'omit' from source: magic vars 32935 1726853733.50060: variable 'profile' from source: include params 32935 1726853733.50076: variable 'item' from source: include params 32935 1726853733.50143: variable 'item' from source: include params 32935 1726853733.50169: variable 'omit' from source: magic vars 32935 1726853733.50224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853733.50265: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853733.50296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853733.50315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.50376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.50379: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853733.50381: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.50382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.50462: Set connection var ansible_timeout to 10 32935 1726853733.50475: Set connection var ansible_shell_type to sh 32935 1726853733.50487: Set connection var ansible_pipelining to False 32935 1726853733.50495: Set connection var ansible_connection to ssh 32935 1726853733.50509: Set connection var ansible_shell_executable to /bin/sh 32935 1726853733.50523: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853733.50551: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.50558: variable 'ansible_connection' from source: unknown 32935 1726853733.50623: variable 'ansible_module_compression' from source: unknown 32935 1726853733.50626: variable 'ansible_shell_type' from source: unknown 32935 1726853733.50628: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.50630: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.50632: variable 'ansible_pipelining' from source: unknown 32935 1726853733.50634: variable 'ansible_timeout' from source: unknown 32935 1726853733.50637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.50735: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853733.50751: variable 'omit' from source: magic vars 32935 1726853733.50759: starting attempt loop 32935 1726853733.50765: running the handler 32935 1726853733.50878: variable 'lsr_net_profile_fingerprint' from source: set_fact 32935 1726853733.50889: Evaluated conditional (lsr_net_profile_fingerprint): True 32935 1726853733.50898: handler run complete 32935 1726853733.50915: attempt loop complete, returning result 32935 1726853733.50948: _execute() done 32935 1726853733.50952: dumping result to json 32935 1726853733.50955: done dumping result, returning 32935 1726853733.50957: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in lsr101 [02083763-bbaf-84df-441d-0000000006bb] 32935 1726853733.50959: sending task result for task 02083763-bbaf-84df-441d-0000000006bb ok: [managed_node1] => { "changed": false } MSG: All assertions passed 32935 1726853733.51103: no more pending results, returning what we have 32935 1726853733.51107: results queue empty 32935 1726853733.51109: checking for any_errors_fatal 32935 1726853733.51115: done checking for any_errors_fatal 32935 1726853733.51117: checking for max_fail_percentage 32935 1726853733.51118: done checking for max_fail_percentage 32935 1726853733.51119: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.51121: done checking to see if all hosts have failed 32935 1726853733.51122: getting the remaining hosts for this loop 32935 1726853733.51123: done getting the remaining hosts for this loop 32935 1726853733.51126: getting the next task for host managed_node1 32935 1726853733.51137: done getting next task for host managed_node1 32935 1726853733.51140: ^ task is: TASK: Include the task 'get_profile_stat.yml' 32935 1726853733.51144: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.51148: getting variables 32935 1726853733.51150: in VariableManager get_vars() 32935 1726853733.51199: Calling all_inventory to load vars for managed_node1 32935 1726853733.51203: Calling groups_inventory to load vars for managed_node1 32935 1726853733.51206: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.51219: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.51222: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.51225: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.51986: done sending task result for task 02083763-bbaf-84df-441d-0000000006bb 32935 1726853733.51990: WORKER PROCESS EXITING 32935 1726853733.53060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.59004: done with get_vars() 32935 1726853733.59032: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:35:33 -0400 (0:00:00.105) 0:00:18.726 ****** 32935 1726853733.59119: entering _queue_task() for managed_node1/include_tasks 32935 1726853733.59483: worker is 1 (out of 1 available) 32935 1726853733.59495: exiting _queue_task() for managed_node1/include_tasks 32935 1726853733.59508: done queuing things up, now waiting for results queue to drain 32935 1726853733.59510: waiting for pending results... 32935 1726853733.59795: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 32935 1726853733.59924: in run() - task 02083763-bbaf-84df-441d-0000000006bf 32935 1726853733.59943: variable 'ansible_search_path' from source: unknown 32935 1726853733.59981: variable 'ansible_search_path' from source: unknown 32935 1726853733.60002: calling self._execute() 32935 1726853733.60107: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.60119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.60140: variable 'omit' from source: magic vars 32935 1726853733.60568: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.60572: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.60576: _execute() done 32935 1726853733.60579: dumping result to json 32935 1726853733.60582: done dumping result, returning 32935 1726853733.60585: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-84df-441d-0000000006bf] 32935 1726853733.60590: sending task result for task 02083763-bbaf-84df-441d-0000000006bf 32935 1726853733.60721: no more pending results, returning what we have 32935 1726853733.60727: in VariableManager get_vars() 32935 1726853733.60811: Calling all_inventory to load vars for managed_node1 32935 1726853733.60815: Calling groups_inventory to load vars for managed_node1 32935 1726853733.60817: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.60832: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.60835: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.60838: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.61589: done sending task result for task 02083763-bbaf-84df-441d-0000000006bf 32935 
1726853733.61592: WORKER PROCESS EXITING 32935 1726853733.62442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.64083: done with get_vars() 32935 1726853733.64107: variable 'ansible_search_path' from source: unknown 32935 1726853733.64108: variable 'ansible_search_path' from source: unknown 32935 1726853733.64146: we have included files to process 32935 1726853733.64148: generating all_blocks data 32935 1726853733.64150: done generating all_blocks data 32935 1726853733.64155: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32935 1726853733.64156: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32935 1726853733.64159: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 32935 1726853733.65082: done processing included file 32935 1726853733.65084: iterating over new_blocks loaded from include file 32935 1726853733.65086: in VariableManager get_vars() 32935 1726853733.65107: done with get_vars() 32935 1726853733.65109: filtering new block on tags 32935 1726853733.65133: done filtering new block on tags 32935 1726853733.65137: in VariableManager get_vars() 32935 1726853733.65156: done with get_vars() 32935 1726853733.65158: filtering new block on tags 32935 1726853733.65186: done filtering new block on tags 32935 1726853733.65188: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 32935 1726853733.65194: extending task lists for all hosts with included blocks 32935 1726853733.65377: done extending task lists 32935 1726853733.65379: done processing included files 32935 1726853733.65379: results queue empty 32935 1726853733.65380: checking for any_errors_fatal 32935 1726853733.65384: done checking for any_errors_fatal 32935 1726853733.65385: checking for max_fail_percentage 32935 1726853733.65386: done checking for max_fail_percentage 32935 1726853733.65387: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.65388: done checking to see if all hosts have failed 32935 1726853733.65389: getting the remaining hosts for this loop 32935 1726853733.65390: done getting the remaining hosts for this loop 32935 1726853733.65393: getting the next task for host managed_node1 32935 1726853733.65397: done getting next task for host managed_node1 32935 1726853733.65404: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 32935 1726853733.65407: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853733.65410: getting variables 32935 1726853733.65411: in VariableManager get_vars() 32935 1726853733.65424: Calling all_inventory to load vars for managed_node1 32935 1726853733.65426: Calling groups_inventory to load vars for managed_node1 32935 1726853733.65428: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.65435: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.65438: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.65441: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.66685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.68239: done with get_vars() 32935 1726853733.68259: done getting variables 32935 1726853733.68301: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:35:33 -0400 (0:00:00.092) 0:00:18.818 ****** 32935 1726853733.68336: entering _queue_task() for managed_node1/set_fact 32935 1726853733.68697: worker is 1 (out of 1 available) 32935 1726853733.68709: exiting _queue_task() for managed_node1/set_fact 32935 1726853733.68723: done queuing things up, now waiting for results queue to drain 32935 1726853733.68725: waiting for pending results... 
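The newly included get_profile_stat.yml starts by resetting the three flags; the ok: result a little further below confirms the exact fact names and values (all false). The stat task queued right after it registers the result referenced as profile_stat throughout this section. A sketch of those two opening tasks follows; the flag names, their initial values, and the profile_stat register name come from the log, while the ifcfg path is an assumption:

  # get_profile_stat.yml:3 and :9 (sketch)
  - name: Initialize NM profile exist and ansible_managed comment flag
    set_fact:
      lsr_net_profile_exists: false
      lsr_net_profile_ansible_managed: false
      lsr_net_profile_fingerprint: false

  - name: Stat profile file
    stat:
      path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed location
    register: profile_stat   # drives the profile_stat.stat.exists checks seen above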
32935 1726853733.69014: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 32935 1726853733.69146: in run() - task 02083763-bbaf-84df-441d-000000000838 32935 1726853733.69164: variable 'ansible_search_path' from source: unknown 32935 1726853733.69170: variable 'ansible_search_path' from source: unknown 32935 1726853733.69223: calling self._execute() 32935 1726853733.69325: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.69334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.69346: variable 'omit' from source: magic vars 32935 1726853733.69719: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.69742: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.69753: variable 'omit' from source: magic vars 32935 1726853733.69800: variable 'omit' from source: magic vars 32935 1726853733.69839: variable 'omit' from source: magic vars 32935 1726853733.69958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853733.69962: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853733.69964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853733.70065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.70069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.70073: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853733.70079: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.70082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.70150: Set connection var ansible_timeout to 10 32935 1726853733.70162: Set connection var ansible_shell_type to sh 32935 1726853733.70183: Set connection var ansible_pipelining to False 32935 1726853733.70194: Set connection var ansible_connection to ssh 32935 1726853733.70204: Set connection var ansible_shell_executable to /bin/sh 32935 1726853733.70214: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853733.70243: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.70252: variable 'ansible_connection' from source: unknown 32935 1726853733.70259: variable 'ansible_module_compression' from source: unknown 32935 1726853733.70267: variable 'ansible_shell_type' from source: unknown 32935 1726853733.70278: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.70285: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.70295: variable 'ansible_pipelining' from source: unknown 32935 1726853733.70383: variable 'ansible_timeout' from source: unknown 32935 1726853733.70386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.70454: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853733.70472: variable 
'omit' from source: magic vars 32935 1726853733.70491: starting attempt loop 32935 1726853733.70499: running the handler 32935 1726853733.70514: handler run complete 32935 1726853733.70527: attempt loop complete, returning result 32935 1726853733.70533: _execute() done 32935 1726853733.70539: dumping result to json 32935 1726853733.70546: done dumping result, returning 32935 1726853733.70555: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-84df-441d-000000000838] 32935 1726853733.70564: sending task result for task 02083763-bbaf-84df-441d-000000000838 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 32935 1726853733.70756: no more pending results, returning what we have 32935 1726853733.70759: results queue empty 32935 1726853733.70760: checking for any_errors_fatal 32935 1726853733.70762: done checking for any_errors_fatal 32935 1726853733.70762: checking for max_fail_percentage 32935 1726853733.70765: done checking for max_fail_percentage 32935 1726853733.70765: checking to see if all hosts have failed and the running result is not ok 32935 1726853733.70766: done checking to see if all hosts have failed 32935 1726853733.70767: getting the remaining hosts for this loop 32935 1726853733.70769: done getting the remaining hosts for this loop 32935 1726853733.70774: getting the next task for host managed_node1 32935 1726853733.70782: done getting next task for host managed_node1 32935 1726853733.70784: ^ task is: TASK: Stat profile file 32935 1726853733.70788: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853733.70792: getting variables 32935 1726853733.70795: in VariableManager get_vars() 32935 1726853733.70837: Calling all_inventory to load vars for managed_node1 32935 1726853733.70840: Calling groups_inventory to load vars for managed_node1 32935 1726853733.70843: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853733.70854: Calling all_plugins_play to load vars for managed_node1 32935 1726853733.70858: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853733.70861: Calling groups_plugins_play to load vars for managed_node1 32935 1726853733.71496: done sending task result for task 02083763-bbaf-84df-441d-000000000838 32935 1726853733.71500: WORKER PROCESS EXITING 32935 1726853733.72481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853733.74075: done with get_vars() 32935 1726853733.74099: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:35:33 -0400 (0:00:00.058) 0:00:18.877 ****** 32935 1726853733.74196: entering _queue_task() for managed_node1/stat 32935 1726853733.74540: worker is 1 (out of 1 available) 32935 1726853733.74553: exiting _queue_task() for managed_node1/stat 32935 1726853733.74574: done queuing things up, now waiting for results queue to drain 32935 1726853733.74576: waiting for pending results... 32935 1726853733.74870: running TaskExecutor() for managed_node1/TASK: Stat profile file 32935 1726853733.74992: in run() - task 02083763-bbaf-84df-441d-000000000839 32935 1726853733.75003: variable 'ansible_search_path' from source: unknown 32935 1726853733.75006: variable 'ansible_search_path' from source: unknown 32935 1726853733.75050: calling self._execute() 32935 1726853733.75155: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.75158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.75175: variable 'omit' from source: magic vars 32935 1726853733.75583: variable 'ansible_distribution_major_version' from source: facts 32935 1726853733.75768: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853733.75777: variable 'omit' from source: magic vars 32935 1726853733.75818: variable 'omit' from source: magic vars 32935 1726853733.75915: variable 'profile' from source: include params 32935 1726853733.75919: variable 'item' from source: include params 32935 1726853733.76254: variable 'item' from source: include params 32935 1726853733.76278: variable 'omit' from source: magic vars 32935 1726853733.76320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853733.76474: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853733.76601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853733.76618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.76631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853733.76664: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 32935 1726853733.76668: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.76670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.76882: Set connection var ansible_timeout to 10 32935 1726853733.76889: Set connection var ansible_shell_type to sh 32935 1726853733.76968: Set connection var ansible_pipelining to False 32935 1726853733.76973: Set connection var ansible_connection to ssh 32935 1726853733.76986: Set connection var ansible_shell_executable to /bin/sh 32935 1726853733.76992: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853733.77086: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.77090: variable 'ansible_connection' from source: unknown 32935 1726853733.77093: variable 'ansible_module_compression' from source: unknown 32935 1726853733.77095: variable 'ansible_shell_type' from source: unknown 32935 1726853733.77098: variable 'ansible_shell_executable' from source: unknown 32935 1726853733.77100: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853733.77102: variable 'ansible_pipelining' from source: unknown 32935 1726853733.77105: variable 'ansible_timeout' from source: unknown 32935 1726853733.77107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853733.77622: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853733.77631: variable 'omit' from source: magic vars 32935 1726853733.77634: starting attempt loop 32935 1726853733.77636: running the handler 32935 1726853733.77639: _low_level_execute_command(): starting 32935 1726853733.77641: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853733.79510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853733.79762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853733.79989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853733.80067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853733.81753: stdout chunk (state=3): >>>/root <<< 32935 1726853733.81888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853733.81903: stderr chunk (state=3): >>><<< 32935 
1726853733.81910: stdout chunk (state=3): >>><<< 32935 1726853733.81941: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853733.81956: _low_level_execute_command(): starting 32935 1726853733.81967: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968 `" && echo ansible-tmp-1726853733.819418-33836-15495209422968="` echo /root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968 `" ) && sleep 0' 32935 1726853733.82774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853733.82781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853733.82805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853733.82876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853733.84746: stdout chunk (state=3): >>>ansible-tmp-1726853733.819418-33836-15495209422968=/root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968 <<< 32935 1726853733.84922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853733.84925: stdout chunk (state=3): >>><<< 32935 1726853733.84928: stderr chunk (state=3): >>><<< 32935 1726853733.84946: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853733.819418-33836-15495209422968=/root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853733.85028: variable 'ansible_module_compression' from source: unknown 32935 1726853733.85081: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 32935 1726853733.85140: variable 'ansible_facts' from source: unknown 32935 1726853733.85220: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/AnsiballZ_stat.py 32935 1726853733.85495: Sending initial data 32935 1726853733.85498: Sent initial data (151 bytes) 32935 1726853733.86117: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853733.86135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853733.86215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853733.86228: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853733.86245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853733.86267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853733.86339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853733.87889: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853733.87946: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853733.88012: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpj5yo4dl9 /root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/AnsiballZ_stat.py <<< 32935 1726853733.88015: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/AnsiballZ_stat.py" <<< 32935 1726853733.88082: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpj5yo4dl9" to remote "/root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/AnsiballZ_stat.py" <<< 32935 1726853733.88834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853733.89001: stderr chunk (state=3): >>><<< 32935 1726853733.89005: stdout chunk (state=3): >>><<< 32935 1726853733.89007: done transferring module to remote 32935 1726853733.89010: _low_level_execute_command(): starting 32935 1726853733.89012: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/ /root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/AnsiballZ_stat.py && sleep 0' 32935 1726853733.89647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853733.89661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853733.89679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853733.89728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853733.89741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853733.89832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853733.89854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 32935 1726853733.89933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853733.91726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853733.91729: stdout chunk (state=3): >>><<< 32935 1726853733.91732: stderr chunk (state=3): >>><<< 32935 1726853733.91837: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853733.91841: _low_level_execute_command(): starting 32935 1726853733.91845: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/AnsiballZ_stat.py && sleep 0' 32935 1726853733.92477: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853733.92480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853733.92537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853734.10861: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} <<< 32935 1726853734.12251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.45.153 closed. <<< 32935 1726853734.12255: stdout chunk (state=3): >>><<< 32935 1726853734.12257: stderr chunk (state=3): >>><<< 32935 1726853734.12389: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr101.90", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853734.12393: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr101.90', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853734.12395: _low_level_execute_command(): starting 32935 1726853734.12397: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853733.819418-33836-15495209422968/ > /dev/null 2>&1 && sleep 0' 32935 1726853734.12982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853734.12996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853734.13026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853734.13136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853734.13157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853734.13233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853734.15134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853734.15164: stdout chunk (state=3): >>><<< 32935 1726853734.15168: stderr chunk (state=3): >>><<< 32935 1726853734.15187: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853734.15377: handler run complete 32935 1726853734.15380: attempt loop complete, returning result 32935 1726853734.15383: _execute() done 32935 1726853734.15385: dumping result to json 32935 1726853734.15387: done dumping result, returning 32935 1726853734.15389: done running TaskExecutor() for managed_node1/TASK: Stat profile file [02083763-bbaf-84df-441d-000000000839] 32935 1726853734.15392: sending task result for task 02083763-bbaf-84df-441d-000000000839 32935 1726853734.15466: done sending task result for task 02083763-bbaf-84df-441d-000000000839 32935 1726853734.15469: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 32935 1726853734.15536: no more pending results, returning what we have 32935 1726853734.15540: results queue empty 32935 1726853734.15541: checking for any_errors_fatal 32935 1726853734.15547: done checking for any_errors_fatal 32935 1726853734.15548: checking for max_fail_percentage 32935 1726853734.15550: done checking for max_fail_percentage 32935 1726853734.15551: checking to see if all hosts have failed and the running result is not ok 32935 1726853734.15552: done checking to see if all hosts have failed 32935 1726853734.15553: getting the remaining hosts for this loop 32935 1726853734.15555: done getting the remaining hosts for this loop 32935 1726853734.15558: getting the next task 
for host managed_node1 32935 1726853734.15566: done getting next task for host managed_node1 32935 1726853734.15569: ^ task is: TASK: Set NM profile exist flag based on the profile files 32935 1726853734.15574: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853734.15579: getting variables 32935 1726853734.15583: in VariableManager get_vars() 32935 1726853734.15634: Calling all_inventory to load vars for managed_node1 32935 1726853734.15637: Calling groups_inventory to load vars for managed_node1 32935 1726853734.15640: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853734.15652: Calling all_plugins_play to load vars for managed_node1 32935 1726853734.15656: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853734.15659: Calling groups_plugins_play to load vars for managed_node1 32935 1726853734.17581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853734.19193: done with get_vars() 32935 1726853734.19216: done getting variables 32935 1726853734.19280: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:35:34 -0400 (0:00:00.451) 0:00:19.328 ****** 32935 1726853734.19311: entering _queue_task() for managed_node1/set_fact 32935 1726853734.19652: worker is 1 (out of 1 available) 32935 1726853734.19663: exiting _queue_task() for managed_node1/set_fact 32935 1726853734.19679: done queuing things up, now waiting for results queue to drain 32935 1726853734.19680: waiting for pending results... 
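For orientation before the queued set_fact task runs: the records above cover the "Stat profile file" task at tasks/get_profile_stat.yml:9. The following is a hypothetical reconstruction of that task, pieced together only from the module_args and the registered-variable name visible in this log; the "{{ profile }}" templating is an assumption (the log only shows the resolved path), and the real file may differ.

- name: Stat profile file
  stat:
    # resolved to /etc/sysconfig/network-scripts/ifcfg-lsr101.90 in this run;
    # templating with "{{ profile }}" is an assumption based on the include params seen above
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # implied by the later "profile_stat.stat.exists" conditional

With stat.exists reported as false here, the queued "Set NM profile exist flag based on the profile files" task is expected to skip, which the records that follow confirm.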
32935 1726853734.20036: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 32935 1726853734.20136: in run() - task 02083763-bbaf-84df-441d-00000000083a 32935 1726853734.20153: variable 'ansible_search_path' from source: unknown 32935 1726853734.20157: variable 'ansible_search_path' from source: unknown 32935 1726853734.20188: calling self._execute() 32935 1726853734.20263: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.20274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.20284: variable 'omit' from source: magic vars 32935 1726853734.20560: variable 'ansible_distribution_major_version' from source: facts 32935 1726853734.20567: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853734.20651: variable 'profile_stat' from source: set_fact 32935 1726853734.20667: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853734.20672: when evaluation is False, skipping this task 32935 1726853734.20675: _execute() done 32935 1726853734.20677: dumping result to json 32935 1726853734.20681: done dumping result, returning 32935 1726853734.20684: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-84df-441d-00000000083a] 32935 1726853734.20686: sending task result for task 02083763-bbaf-84df-441d-00000000083a 32935 1726853734.20762: done sending task result for task 02083763-bbaf-84df-441d-00000000083a 32935 1726853734.20765: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853734.20842: no more pending results, returning what we have 32935 1726853734.20845: results queue empty 32935 1726853734.20846: checking for any_errors_fatal 32935 1726853734.20854: done checking for any_errors_fatal 32935 1726853734.20855: checking for max_fail_percentage 32935 1726853734.20856: done checking for max_fail_percentage 32935 1726853734.20860: checking to see if all hosts have failed and the running result is not ok 32935 1726853734.20861: done checking to see if all hosts have failed 32935 1726853734.20862: getting the remaining hosts for this loop 32935 1726853734.20863: done getting the remaining hosts for this loop 32935 1726853734.20866: getting the next task for host managed_node1 32935 1726853734.20876: done getting next task for host managed_node1 32935 1726853734.20880: ^ task is: TASK: Get NM profile info 32935 1726853734.20883: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853734.20888: getting variables 32935 1726853734.20889: in VariableManager get_vars() 32935 1726853734.20923: Calling all_inventory to load vars for managed_node1 32935 1726853734.20926: Calling groups_inventory to load vars for managed_node1 32935 1726853734.20928: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853734.20937: Calling all_plugins_play to load vars for managed_node1 32935 1726853734.20939: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853734.20942: Calling groups_plugins_play to load vars for managed_node1 32935 1726853734.21698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853734.23152: done with get_vars() 32935 1726853734.23170: done getting variables 32935 1726853734.23216: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:35:34 -0400 (0:00:00.039) 0:00:19.368 ****** 32935 1726853734.23238: entering _queue_task() for managed_node1/shell 32935 1726853734.23485: worker is 1 (out of 1 available) 32935 1726853734.23499: exiting _queue_task() for managed_node1/shell 32935 1726853734.23513: done queuing things up, now waiting for results queue to drain 32935 1726853734.23515: waiting for pending results... 32935 1726853734.23688: running TaskExecutor() for managed_node1/TASK: Get NM profile info 32935 1726853734.23765: in run() - task 02083763-bbaf-84df-441d-00000000083b 32935 1726853734.23776: variable 'ansible_search_path' from source: unknown 32935 1726853734.23780: variable 'ansible_search_path' from source: unknown 32935 1726853734.23809: calling self._execute() 32935 1726853734.23888: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.23891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.23900: variable 'omit' from source: magic vars 32935 1726853734.24168: variable 'ansible_distribution_major_version' from source: facts 32935 1726853734.24183: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853734.24186: variable 'omit' from source: magic vars 32935 1726853734.24216: variable 'omit' from source: magic vars 32935 1726853734.24287: variable 'profile' from source: include params 32935 1726853734.24291: variable 'item' from source: include params 32935 1726853734.24338: variable 'item' from source: include params 32935 1726853734.24353: variable 'omit' from source: magic vars 32935 1726853734.24390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853734.24419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853734.24435: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853734.24448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853734.24460: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853734.24484: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853734.24487: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.24489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.24561: Set connection var ansible_timeout to 10 32935 1726853734.24565: Set connection var ansible_shell_type to sh 32935 1726853734.24573: Set connection var ansible_pipelining to False 32935 1726853734.24576: Set connection var ansible_connection to ssh 32935 1726853734.24581: Set connection var ansible_shell_executable to /bin/sh 32935 1726853734.24586: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853734.24604: variable 'ansible_shell_executable' from source: unknown 32935 1726853734.24606: variable 'ansible_connection' from source: unknown 32935 1726853734.24610: variable 'ansible_module_compression' from source: unknown 32935 1726853734.24613: variable 'ansible_shell_type' from source: unknown 32935 1726853734.24616: variable 'ansible_shell_executable' from source: unknown 32935 1726853734.24619: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.24621: variable 'ansible_pipelining' from source: unknown 32935 1726853734.24624: variable 'ansible_timeout' from source: unknown 32935 1726853734.24626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.24726: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853734.24742: variable 'omit' from source: magic vars 32935 1726853734.24746: starting attempt loop 32935 1726853734.24749: running the handler 32935 1726853734.24752: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853734.24769: _low_level_execute_command(): starting 32935 1726853734.24777: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853734.25496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' <<< 32935 1726853734.25515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853734.25532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853734.25621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853734.27256: stdout chunk (state=3): >>>/root <<< 32935 1726853734.27389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853734.27402: stderr chunk (state=3): >>><<< 32935 1726853734.27427: stdout chunk (state=3): >>><<< 32935 1726853734.27462: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853734.27545: _low_level_execute_command(): starting 32935 1726853734.27550: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945 `" && echo ansible-tmp-1726853734.2744977-33859-134925144048945="` echo /root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945 `" ) && sleep 0' 32935 1726853734.28255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853734.28360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853734.28365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853734.28410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 
1726853734.28467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853734.28483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853734.30368: stdout chunk (state=3): >>>ansible-tmp-1726853734.2744977-33859-134925144048945=/root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945 <<< 32935 1726853734.30882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853734.30887: stdout chunk (state=3): >>><<< 32935 1726853734.30889: stderr chunk (state=3): >>><<< 32935 1726853734.30891: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853734.2744977-33859-134925144048945=/root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853734.30894: variable 'ansible_module_compression' from source: unknown 32935 1726853734.30896: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853734.30898: variable 'ansible_facts' from source: unknown 32935 1726853734.30933: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/AnsiballZ_command.py 32935 1726853734.31092: Sending initial data 32935 1726853734.31101: Sent initial data (156 bytes) 32935 1726853734.31669: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853734.31687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853734.31788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853734.31803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853734.31823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853734.31892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853734.33418: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853734.33476: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853734.33528: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmplfjp4fkl /root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/AnsiballZ_command.py <<< 32935 1726853734.33531: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/AnsiballZ_command.py" <<< 32935 1726853734.33586: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmplfjp4fkl" to remote "/root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/AnsiballZ_command.py" <<< 32935 1726853734.34535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853734.34538: stderr chunk (state=3): >>><<< 32935 1726853734.34541: stdout chunk (state=3): >>><<< 32935 1726853734.34543: done transferring module to remote 32935 1726853734.34545: _low_level_execute_command(): starting 32935 1726853734.34547: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/ /root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/AnsiballZ_command.py && sleep 0' 32935 1726853734.35289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853734.35335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853734.35364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853734.35382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853734.35446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853734.37243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853734.37255: stdout chunk (state=3): >>><<< 32935 1726853734.37274: stderr chunk (state=3): >>><<< 32935 1726853734.37295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853734.37303: _low_level_execute_command(): starting 32935 1726853734.37311: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/AnsiballZ_command.py && sleep 0' 32935 1726853734.37942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853734.37956: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853734.37969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853734.37992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853734.38010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853734.38029: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853734.38046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853734.38086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853734.38156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853734.38177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853734.38203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853734.38278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853734.55370: stdout chunk (state=3): >>> {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-20 13:35:34.532928", "end": "2024-09-20 13:35:34.552627", "delta": "0:00:00.019699", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853734.57124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853734.57129: stdout chunk (state=3): >>><<< 32935 1726853734.57131: stderr chunk (state=3): >>><<< 32935 1726853734.57134: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "start": "2024-09-20 13:35:34.532928", "end": "2024-09-20 13:35:34.552627", "delta": "0:00:00.019699", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
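The module result above belongs to the "Get NM profile info" task at tasks/get_profile_stat.yml:25. A minimal sketch of what that shell task likely looks like, inferred from the logged command and from the "nm_profile_exists.rc == 0" conditional evaluated later (the register name and the "{{ profile }}" templating are assumptions, not read from the file):

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists   # inferred from the later rc check; actual name not shown in this excerpt

In this run the command expanded to grep lsr101.90 and returned rc=0 with /etc/NetworkManager/system-connections/lsr101.90.nmconnection on stdout, so the NM keyfile profile exists on the managed node.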
32935 1726853734.57136: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853734.57138: _low_level_execute_command(): starting 32935 1726853734.57140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853734.2744977-33859-134925144048945/ > /dev/null 2>&1 && sleep 0' 32935 1726853734.57696: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853734.57706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853734.57724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853734.57730: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853734.57735: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853734.57776: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32935 1726853734.57779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853734.57782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853734.57866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853734.57905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853734.59752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853734.59756: stdout chunk (state=3): >>><<< 32935 1726853734.59976: stderr chunk (state=3): >>><<< 32935 1726853734.59979: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853734.59981: handler run complete 32935 1726853734.59983: Evaluated conditional (False): False 32935 1726853734.59985: attempt loop complete, returning result 32935 1726853734.59986: _execute() done 32935 1726853734.59988: dumping result to json 32935 1726853734.59989: done dumping result, returning 32935 1726853734.59991: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [02083763-bbaf-84df-441d-00000000083b] 32935 1726853734.59992: sending task result for task 02083763-bbaf-84df-441d-00000000083b 32935 1726853734.60055: done sending task result for task 02083763-bbaf-84df-441d-00000000083b 32935 1726853734.60058: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr101.90 | grep /etc", "delta": "0:00:00.019699", "end": "2024-09-20 13:35:34.552627", "rc": 0, "start": "2024-09-20 13:35:34.532928" } STDOUT: lsr101.90 /etc/NetworkManager/system-connections/lsr101.90.nmconnection 32935 1726853734.60146: no more pending results, returning what we have 32935 1726853734.60149: results queue empty 32935 1726853734.60150: checking for any_errors_fatal 32935 1726853734.60156: done checking for any_errors_fatal 32935 1726853734.60156: checking for max_fail_percentage 32935 1726853734.60161: done checking for max_fail_percentage 32935 1726853734.60162: checking to see if all hosts have failed and the running result is not ok 32935 1726853734.60163: done checking to see if all hosts have failed 32935 1726853734.60163: getting the remaining hosts for this loop 32935 1726853734.60165: done getting the remaining hosts for this loop 32935 1726853734.60169: getting the next task for host managed_node1 32935 1726853734.60182: done getting next task for host managed_node1 32935 1726853734.60185: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32935 1726853734.60189: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853734.60193: getting variables 32935 1726853734.60195: in VariableManager get_vars() 32935 1726853734.60238: Calling all_inventory to load vars for managed_node1 32935 1726853734.60241: Calling groups_inventory to load vars for managed_node1 32935 1726853734.60243: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853734.60254: Calling all_plugins_play to load vars for managed_node1 32935 1726853734.60257: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853734.60262: Calling groups_plugins_play to load vars for managed_node1 32935 1726853734.61780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853734.63512: done with get_vars() 32935 1726853734.63544: done getting variables 32935 1726853734.63607: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:35:34 -0400 (0:00:00.403) 0:00:19.772 ****** 32935 1726853734.63645: entering _queue_task() for managed_node1/set_fact 32935 1726853734.64008: worker is 1 (out of 1 available) 32935 1726853734.64022: exiting _queue_task() for managed_node1/set_fact 32935 1726853734.64036: done queuing things up, now waiting for results queue to drain 32935 1726853734.64038: waiting for pending results... 
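[annotation] The "Get NM profile info" task traced above runs a shell pipeline over nmcli (_uses_shell: True) and registers its result as nm_profile_exists, which the next task's conditional inspects. A minimal sketch of a task with that shape, assuming the hardcoded lsr101.90 stands in for the {{ profile }} value passed by the include; this is not the actual text of get_profile_stat.yml.

- name: Get NM profile info
  shell: "nmcli -f NAME,FILENAME connection show | grep lsr101.90 | grep /etc"
  register: nm_profile_exists
  # The run above reported rc=0 with stdout:
  #   lsr101.90  /etc/NetworkManager/system-connections/lsr101.90.nmconnection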
32935 1726853734.64228: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 32935 1726853734.64351: in run() - task 02083763-bbaf-84df-441d-00000000083c 32935 1726853734.64369: variable 'ansible_search_path' from source: unknown 32935 1726853734.64379: variable 'ansible_search_path' from source: unknown 32935 1726853734.64417: calling self._execute() 32935 1726853734.64516: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.64526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.64539: variable 'omit' from source: magic vars 32935 1726853734.64890: variable 'ansible_distribution_major_version' from source: facts 32935 1726853734.64906: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853734.65032: variable 'nm_profile_exists' from source: set_fact 32935 1726853734.65053: Evaluated conditional (nm_profile_exists.rc == 0): True 32935 1726853734.65066: variable 'omit' from source: magic vars 32935 1726853734.65115: variable 'omit' from source: magic vars 32935 1726853734.65149: variable 'omit' from source: magic vars 32935 1726853734.65194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853734.65233: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853734.65259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853734.65285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853734.65305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853734.65576: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853734.65580: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.65582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.65585: Set connection var ansible_timeout to 10 32935 1726853734.65587: Set connection var ansible_shell_type to sh 32935 1726853734.65590: Set connection var ansible_pipelining to False 32935 1726853734.65592: Set connection var ansible_connection to ssh 32935 1726853734.65594: Set connection var ansible_shell_executable to /bin/sh 32935 1726853734.65596: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853734.65598: variable 'ansible_shell_executable' from source: unknown 32935 1726853734.65600: variable 'ansible_connection' from source: unknown 32935 1726853734.65602: variable 'ansible_module_compression' from source: unknown 32935 1726853734.65604: variable 'ansible_shell_type' from source: unknown 32935 1726853734.65605: variable 'ansible_shell_executable' from source: unknown 32935 1726853734.65607: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.65609: variable 'ansible_pipelining' from source: unknown 32935 1726853734.65612: variable 'ansible_timeout' from source: unknown 32935 1726853734.65614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.65680: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853734.65696: variable 'omit' from source: magic vars 32935 1726853734.65707: starting attempt loop 32935 1726853734.65713: running the handler 32935 1726853734.65729: handler run complete 32935 1726853734.65741: attempt loop complete, returning result 32935 1726853734.65747: _execute() done 32935 1726853734.65753: dumping result to json 32935 1726853734.65759: done dumping result, returning 32935 1726853734.65770: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-84df-441d-00000000083c] 32935 1726853734.65783: sending task result for task 02083763-bbaf-84df-441d-00000000083c 32935 1726853734.65882: done sending task result for task 02083763-bbaf-84df-441d-00000000083c 32935 1726853734.65890: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 32935 1726853734.65943: no more pending results, returning what we have 32935 1726853734.65946: results queue empty 32935 1726853734.65947: checking for any_errors_fatal 32935 1726853734.65954: done checking for any_errors_fatal 32935 1726853734.65955: checking for max_fail_percentage 32935 1726853734.65957: done checking for max_fail_percentage 32935 1726853734.65960: checking to see if all hosts have failed and the running result is not ok 32935 1726853734.65961: done checking to see if all hosts have failed 32935 1726853734.65962: getting the remaining hosts for this loop 32935 1726853734.65963: done getting the remaining hosts for this loop 32935 1726853734.65966: getting the next task for host managed_node1 32935 1726853734.65979: done getting next task for host managed_node1 32935 1726853734.65981: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 32935 1726853734.65985: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853734.66186: getting variables 32935 1726853734.66188: in VariableManager get_vars() 32935 1726853734.66222: Calling all_inventory to load vars for managed_node1 32935 1726853734.66224: Calling groups_inventory to load vars for managed_node1 32935 1726853734.66227: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853734.66237: Calling all_plugins_play to load vars for managed_node1 32935 1726853734.66240: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853734.66243: Calling groups_plugins_play to load vars for managed_node1 32935 1726853734.68374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853734.70935: done with get_vars() 32935 1726853734.70963: done getting variables 32935 1726853734.71023: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853734.71347: variable 'profile' from source: include params 32935 1726853734.71351: variable 'item' from source: include params 32935 1726853734.71410: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-lsr101.90] ********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:35:34 -0400 (0:00:00.078) 0:00:19.850 ****** 32935 1726853734.71448: entering _queue_task() for managed_node1/command 32935 1726853734.71977: worker is 1 (out of 1 available) 32935 1726853734.71988: exiting _queue_task() for managed_node1/command 32935 1726853734.72000: done queuing things up, now waiting for results queue to drain 32935 1726853734.72001: waiting for pending results... 
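[annotation] The set_fact task above only ran because nm_profile_exists.rc == 0 evaluated True, and its result reports three facts set to true. The sketch below uses the fact names and the conditional exactly as they appear in the trace; the real task at get_profile_stat.yml:35 may derive the values rather than hardcode them.

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0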
32935 1726853734.72082: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 32935 1726853734.72223: in run() - task 02083763-bbaf-84df-441d-00000000083e 32935 1726853734.72246: variable 'ansible_search_path' from source: unknown 32935 1726853734.72254: variable 'ansible_search_path' from source: unknown 32935 1726853734.72296: calling self._execute() 32935 1726853734.72400: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.72413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.72428: variable 'omit' from source: magic vars 32935 1726853734.72802: variable 'ansible_distribution_major_version' from source: facts 32935 1726853734.72820: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853734.72994: variable 'profile_stat' from source: set_fact 32935 1726853734.72999: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853734.73002: when evaluation is False, skipping this task 32935 1726853734.73004: _execute() done 32935 1726853734.73006: dumping result to json 32935 1726853734.73008: done dumping result, returning 32935 1726853734.73010: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-lsr101.90 [02083763-bbaf-84df-441d-00000000083e] 32935 1726853734.73012: sending task result for task 02083763-bbaf-84df-441d-00000000083e skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853734.73147: no more pending results, returning what we have 32935 1726853734.73150: results queue empty 32935 1726853734.73152: checking for any_errors_fatal 32935 1726853734.73159: done checking for any_errors_fatal 32935 1726853734.73160: checking for max_fail_percentage 32935 1726853734.73162: done checking for max_fail_percentage 32935 1726853734.73163: checking to see if all hosts have failed and the running result is not ok 32935 1726853734.73164: done checking to see if all hosts have failed 32935 1726853734.73165: getting the remaining hosts for this loop 32935 1726853734.73167: done getting the remaining hosts for this loop 32935 1726853734.73170: getting the next task for host managed_node1 32935 1726853734.73181: done getting next task for host managed_node1 32935 1726853734.73184: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 32935 1726853734.73187: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853734.73192: getting variables 32935 1726853734.73194: in VariableManager get_vars() 32935 1726853734.73235: Calling all_inventory to load vars for managed_node1 32935 1726853734.73238: Calling groups_inventory to load vars for managed_node1 32935 1726853734.73240: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853734.73253: Calling all_plugins_play to load vars for managed_node1 32935 1726853734.73256: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853734.73259: Calling groups_plugins_play to load vars for managed_node1 32935 1726853734.74084: done sending task result for task 02083763-bbaf-84df-441d-00000000083e 32935 1726853734.74087: WORKER PROCESS EXITING 32935 1726853734.74872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853734.76446: done with get_vars() 32935 1726853734.76469: done getting variables 32935 1726853734.76529: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853734.76636: variable 'profile' from source: include params 32935 1726853734.76640: variable 'item' from source: include params 32935 1726853734.76699: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-lsr101.90] ******************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:35:34 -0400 (0:00:00.052) 0:00:19.902 ****** 32935 1726853734.76730: entering _queue_task() for managed_node1/set_fact 32935 1726853734.77043: worker is 1 (out of 1 available) 32935 1726853734.77057: exiting _queue_task() for managed_node1/set_fact 32935 1726853734.77069: done queuing things up, now waiting for results queue to drain 32935 1726853734.77073: waiting for pending results... 
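[annotation] The "Get the ansible_managed comment in ifcfg-lsr101.90" task was skipped because profile_stat.stat.exists is false: this run manages the profile as a NetworkManager keyfile, so there is no ifcfg file to inspect, and the same guard produces the next three skips as well. A hedged sketch of the guard pattern; the grep command, the ifcfg path, and the register name are hypothetical, since the skipped task's arguments never appear in the log.

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # hypothetical command and path
  register: ifcfg_ansible_managed  # hypothetical register name
  when: profile_stat.stat.exists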
32935 1726853734.77663: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 32935 1726853734.77964: in run() - task 02083763-bbaf-84df-441d-00000000083f 32935 1726853734.77999: variable 'ansible_search_path' from source: unknown 32935 1726853734.78043: variable 'ansible_search_path' from source: unknown 32935 1726853734.78087: calling self._execute() 32935 1726853734.78576: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.78581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.78584: variable 'omit' from source: magic vars 32935 1726853734.79218: variable 'ansible_distribution_major_version' from source: facts 32935 1726853734.79235: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853734.79541: variable 'profile_stat' from source: set_fact 32935 1726853734.79560: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853734.79570: when evaluation is False, skipping this task 32935 1726853734.79580: _execute() done 32935 1726853734.79588: dumping result to json 32935 1726853734.79596: done dumping result, returning 32935 1726853734.79608: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-lsr101.90 [02083763-bbaf-84df-441d-00000000083f] 32935 1726853734.79619: sending task result for task 02083763-bbaf-84df-441d-00000000083f skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853734.79784: no more pending results, returning what we have 32935 1726853734.79788: results queue empty 32935 1726853734.79790: checking for any_errors_fatal 32935 1726853734.79796: done checking for any_errors_fatal 32935 1726853734.79797: checking for max_fail_percentage 32935 1726853734.79799: done checking for max_fail_percentage 32935 1726853734.79799: checking to see if all hosts have failed and the running result is not ok 32935 1726853734.79801: done checking to see if all hosts have failed 32935 1726853734.79801: getting the remaining hosts for this loop 32935 1726853734.79803: done getting the remaining hosts for this loop 32935 1726853734.79807: getting the next task for host managed_node1 32935 1726853734.79815: done getting next task for host managed_node1 32935 1726853734.79818: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 32935 1726853734.79822: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853734.79828: getting variables 32935 1726853734.79830: in VariableManager get_vars() 32935 1726853734.79877: Calling all_inventory to load vars for managed_node1 32935 1726853734.79881: Calling groups_inventory to load vars for managed_node1 32935 1726853734.79884: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853734.79899: Calling all_plugins_play to load vars for managed_node1 32935 1726853734.79902: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853734.79905: Calling groups_plugins_play to load vars for managed_node1 32935 1726853734.81179: done sending task result for task 02083763-bbaf-84df-441d-00000000083f 32935 1726853734.81183: WORKER PROCESS EXITING 32935 1726853734.82409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853734.85866: done with get_vars() 32935 1726853734.86103: done getting variables 32935 1726853734.86163: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853734.86278: variable 'profile' from source: include params 32935 1726853734.86282: variable 'item' from source: include params 32935 1726853734.86339: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-lsr101.90] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:35:34 -0400 (0:00:00.096) 0:00:19.999 ****** 32935 1726853734.86370: entering _queue_task() for managed_node1/command 32935 1726853734.86730: worker is 1 (out of 1 available) 32935 1726853734.86742: exiting _queue_task() for managed_node1/command 32935 1726853734.86755: done queuing things up, now waiting for results queue to drain 32935 1726853734.86756: waiting for pending results... 
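[annotation] The repeated false_condition profile_stat.stat.exists implies that an earlier task in get_profile_stat.yml registered a stat result as profile_stat. A hedged sketch of what such a registration could look like; only the register name and the .stat.exists access are taken from the trace, the task name and path are assumptions.

- name: Get the stat of the ifcfg file  # hypothetical task name
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # hypothetical path
  register: profile_stat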
32935 1726853734.87042: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr101.90 32935 1726853734.87178: in run() - task 02083763-bbaf-84df-441d-000000000840 32935 1726853734.87199: variable 'ansible_search_path' from source: unknown 32935 1726853734.87206: variable 'ansible_search_path' from source: unknown 32935 1726853734.87242: calling self._execute() 32935 1726853734.87340: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.87350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.87365: variable 'omit' from source: magic vars 32935 1726853734.87743: variable 'ansible_distribution_major_version' from source: facts 32935 1726853734.87762: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853734.87895: variable 'profile_stat' from source: set_fact 32935 1726853734.87912: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853734.87918: when evaluation is False, skipping this task 32935 1726853734.87924: _execute() done 32935 1726853734.87929: dumping result to json 32935 1726853734.87935: done dumping result, returning 32935 1726853734.87943: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-lsr101.90 [02083763-bbaf-84df-441d-000000000840] 32935 1726853734.87974: sending task result for task 02083763-bbaf-84df-441d-000000000840 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853734.88222: no more pending results, returning what we have 32935 1726853734.88227: results queue empty 32935 1726853734.88228: checking for any_errors_fatal 32935 1726853734.88236: done checking for any_errors_fatal 32935 1726853734.88237: checking for max_fail_percentage 32935 1726853734.88239: done checking for max_fail_percentage 32935 1726853734.88239: checking to see if all hosts have failed and the running result is not ok 32935 1726853734.88241: done checking to see if all hosts have failed 32935 1726853734.88241: getting the remaining hosts for this loop 32935 1726853734.88243: done getting the remaining hosts for this loop 32935 1726853734.88246: getting the next task for host managed_node1 32935 1726853734.88256: done getting next task for host managed_node1 32935 1726853734.88258: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 32935 1726853734.88262: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853734.88267: getting variables 32935 1726853734.88268: in VariableManager get_vars() 32935 1726853734.88311: Calling all_inventory to load vars for managed_node1 32935 1726853734.88314: Calling groups_inventory to load vars for managed_node1 32935 1726853734.88317: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853734.88331: Calling all_plugins_play to load vars for managed_node1 32935 1726853734.88334: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853734.88336: Calling groups_plugins_play to load vars for managed_node1 32935 1726853734.88884: done sending task result for task 02083763-bbaf-84df-441d-000000000840 32935 1726853734.88887: WORKER PROCESS EXITING 32935 1726853734.90576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853734.92923: done with get_vars() 32935 1726853734.92944: done getting variables 32935 1726853734.93005: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853734.93117: variable 'profile' from source: include params 32935 1726853734.93121: variable 'item' from source: include params 32935 1726853734.93181: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-lsr101.90] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:35:34 -0400 (0:00:00.068) 0:00:20.067 ****** 32935 1726853734.93211: entering _queue_task() for managed_node1/set_fact 32935 1726853734.93543: worker is 1 (out of 1 available) 32935 1726853734.93554: exiting _queue_task() for managed_node1/set_fact 32935 1726853734.93566: done queuing things up, now waiting for results queue to drain 32935 1726853734.93568: waiting for pending results... 
32935 1726853734.93842: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 32935 1726853734.93966: in run() - task 02083763-bbaf-84df-441d-000000000841 32935 1726853734.93988: variable 'ansible_search_path' from source: unknown 32935 1726853734.93996: variable 'ansible_search_path' from source: unknown 32935 1726853734.94040: calling self._execute() 32935 1726853734.94148: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853734.94158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853734.94174: variable 'omit' from source: magic vars 32935 1726853734.94536: variable 'ansible_distribution_major_version' from source: facts 32935 1726853734.94558: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853734.94682: variable 'profile_stat' from source: set_fact 32935 1726853734.94701: Evaluated conditional (profile_stat.stat.exists): False 32935 1726853734.94709: when evaluation is False, skipping this task 32935 1726853734.94717: _execute() done 32935 1726853734.94724: dumping result to json 32935 1726853734.94732: done dumping result, returning 32935 1726853734.94743: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-lsr101.90 [02083763-bbaf-84df-441d-000000000841] 32935 1726853734.94753: sending task result for task 02083763-bbaf-84df-441d-000000000841 32935 1726853734.94854: done sending task result for task 02083763-bbaf-84df-441d-000000000841 32935 1726853734.94977: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 32935 1726853734.95021: no more pending results, returning what we have 32935 1726853734.95025: results queue empty 32935 1726853734.95026: checking for any_errors_fatal 32935 1726853734.95032: done checking for any_errors_fatal 32935 1726853734.95033: checking for max_fail_percentage 32935 1726853734.95036: done checking for max_fail_percentage 32935 1726853734.95037: checking to see if all hosts have failed and the running result is not ok 32935 1726853734.95038: done checking to see if all hosts have failed 32935 1726853734.95039: getting the remaining hosts for this loop 32935 1726853734.95040: done getting the remaining hosts for this loop 32935 1726853734.95044: getting the next task for host managed_node1 32935 1726853734.95053: done getting next task for host managed_node1 32935 1726853734.95056: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 32935 1726853734.95059: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853734.95063: getting variables 32935 1726853734.95064: in VariableManager get_vars() 32935 1726853734.95110: Calling all_inventory to load vars for managed_node1 32935 1726853734.95113: Calling groups_inventory to load vars for managed_node1 32935 1726853734.95116: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853734.95131: Calling all_plugins_play to load vars for managed_node1 32935 1726853734.95134: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853734.95137: Calling groups_plugins_play to load vars for managed_node1 32935 1726853734.97750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853734.99830: done with get_vars() 32935 1726853734.99861: done getting variables 32935 1726853734.99926: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853735.00041: variable 'profile' from source: include params 32935 1726853735.00045: variable 'item' from source: include params 32935 1726853735.00103: variable 'item' from source: include params TASK [Assert that the profile is present - 'lsr101.90'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:35:35 -0400 (0:00:00.069) 0:00:20.136 ****** 32935 1726853735.00132: entering _queue_task() for managed_node1/assert 32935 1726853735.00477: worker is 1 (out of 1 available) 32935 1726853735.00489: exiting _queue_task() for managed_node1/assert 32935 1726853735.00501: done queuing things up, now waiting for results queue to drain 32935 1726853735.00503: waiting for pending results... 
32935 1726853735.00770: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'lsr101.90' 32935 1726853735.00910: in run() - task 02083763-bbaf-84df-441d-0000000006c0 32935 1726853735.00930: variable 'ansible_search_path' from source: unknown 32935 1726853735.00938: variable 'ansible_search_path' from source: unknown 32935 1726853735.00982: calling self._execute() 32935 1726853735.01089: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.01104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.01121: variable 'omit' from source: magic vars 32935 1726853735.01547: variable 'ansible_distribution_major_version' from source: facts 32935 1726853735.01550: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853735.01553: variable 'omit' from source: magic vars 32935 1726853735.01577: variable 'omit' from source: magic vars 32935 1726853735.01682: variable 'profile' from source: include params 32935 1726853735.01693: variable 'item' from source: include params 32935 1726853735.01760: variable 'item' from source: include params 32935 1726853735.01789: variable 'omit' from source: magic vars 32935 1726853735.01877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853735.01881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853735.01907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853735.01930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.01945: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.01981: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853735.01993: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.02000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.02106: Set connection var ansible_timeout to 10 32935 1726853735.02276: Set connection var ansible_shell_type to sh 32935 1726853735.02279: Set connection var ansible_pipelining to False 32935 1726853735.02282: Set connection var ansible_connection to ssh 32935 1726853735.02284: Set connection var ansible_shell_executable to /bin/sh 32935 1726853735.02286: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853735.02288: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.02290: variable 'ansible_connection' from source: unknown 32935 1726853735.02292: variable 'ansible_module_compression' from source: unknown 32935 1726853735.02294: variable 'ansible_shell_type' from source: unknown 32935 1726853735.02296: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.02298: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.02300: variable 'ansible_pipelining' from source: unknown 32935 1726853735.02302: variable 'ansible_timeout' from source: unknown 32935 1726853735.02304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.02357: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853735.02379: variable 'omit' from source: magic vars 32935 1726853735.02389: starting attempt loop 32935 1726853735.02395: running the handler 32935 1726853735.02505: variable 'lsr_net_profile_exists' from source: set_fact 32935 1726853735.02517: Evaluated conditional (lsr_net_profile_exists): True 32935 1726853735.02532: handler run complete 32935 1726853735.02552: attempt loop complete, returning result 32935 1726853735.02559: _execute() done 32935 1726853735.02567: dumping result to json 32935 1726853735.02576: done dumping result, returning 32935 1726853735.02588: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'lsr101.90' [02083763-bbaf-84df-441d-0000000006c0] 32935 1726853735.02596: sending task result for task 02083763-bbaf-84df-441d-0000000006c0 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 32935 1726853735.02825: no more pending results, returning what we have 32935 1726853735.02828: results queue empty 32935 1726853735.02830: checking for any_errors_fatal 32935 1726853735.02837: done checking for any_errors_fatal 32935 1726853735.02838: checking for max_fail_percentage 32935 1726853735.02840: done checking for max_fail_percentage 32935 1726853735.02841: checking to see if all hosts have failed and the running result is not ok 32935 1726853735.02843: done checking to see if all hosts have failed 32935 1726853735.02843: getting the remaining hosts for this loop 32935 1726853735.02845: done getting the remaining hosts for this loop 32935 1726853735.02848: getting the next task for host managed_node1 32935 1726853735.02858: done getting next task for host managed_node1 32935 1726853735.02861: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 32935 1726853735.02864: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853735.02869: getting variables 32935 1726853735.02873: in VariableManager get_vars() 32935 1726853735.02917: Calling all_inventory to load vars for managed_node1 32935 1726853735.02920: Calling groups_inventory to load vars for managed_node1 32935 1726853735.02923: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853735.02935: Calling all_plugins_play to load vars for managed_node1 32935 1726853735.02938: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853735.02941: Calling groups_plugins_play to load vars for managed_node1 32935 1726853735.03517: done sending task result for task 02083763-bbaf-84df-441d-0000000006c0 32935 1726853735.03521: WORKER PROCESS EXITING 32935 1726853735.04618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853735.06814: done with get_vars() 32935 1726853735.06848: done getting variables 32935 1726853735.06914: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853735.07036: variable 'profile' from source: include params 32935 1726853735.07040: variable 'item' from source: include params 32935 1726853735.07091: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'lsr101.90'] ******* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:35:35 -0400 (0:00:00.069) 0:00:20.206 ****** 32935 1726853735.07127: entering _queue_task() for managed_node1/assert 32935 1726853735.07470: worker is 1 (out of 1 available) 32935 1726853735.07484: exiting _queue_task() for managed_node1/assert 32935 1726853735.07497: done queuing things up, now waiting for results queue to drain 32935 1726853735.07498: waiting for pending results... 
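[annotation] Each task trace lists the connection settings the executor resolves for managed_node1: ssh connection, sh shell via /bin/sh, pipelining off, a 10-second timeout, and ZIP_DEFLATED module compression. As an illustration only, equivalent settings could be expressed as inventory variables like the sketch below; in this run some of them may simply be ansible-core defaults rather than explicit vars.

all:
  hosts:
    managed_node1:
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_pipelining: false
      ansible_timeout: 10
      ansible_module_compression: ZIP_DEFLATED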
32935 1726853735.07823: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'lsr101.90' 32935 1726853735.07938: in run() - task 02083763-bbaf-84df-441d-0000000006c1 32935 1726853735.07961: variable 'ansible_search_path' from source: unknown 32935 1726853735.07968: variable 'ansible_search_path' from source: unknown 32935 1726853735.08008: calling self._execute() 32935 1726853735.08277: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.08281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.08284: variable 'omit' from source: magic vars 32935 1726853735.08917: variable 'ansible_distribution_major_version' from source: facts 32935 1726853735.08992: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853735.09004: variable 'omit' from source: magic vars 32935 1726853735.09092: variable 'omit' from source: magic vars 32935 1726853735.09309: variable 'profile' from source: include params 32935 1726853735.09319: variable 'item' from source: include params 32935 1726853735.09396: variable 'item' from source: include params 32935 1726853735.09425: variable 'omit' from source: magic vars 32935 1726853735.09480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853735.09531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853735.09576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853735.09579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.09593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.09629: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853735.09638: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.09712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.09750: Set connection var ansible_timeout to 10 32935 1726853735.09761: Set connection var ansible_shell_type to sh 32935 1726853735.09774: Set connection var ansible_pipelining to False 32935 1726853735.09782: Set connection var ansible_connection to ssh 32935 1726853735.09791: Set connection var ansible_shell_executable to /bin/sh 32935 1726853735.09801: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853735.09833: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.09841: variable 'ansible_connection' from source: unknown 32935 1726853735.09847: variable 'ansible_module_compression' from source: unknown 32935 1726853735.09853: variable 'ansible_shell_type' from source: unknown 32935 1726853735.09859: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.09865: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.09874: variable 'ansible_pipelining' from source: unknown 32935 1726853735.09881: variable 'ansible_timeout' from source: unknown 32935 1726853735.09888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.10026: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853735.10146: variable 'omit' from source: magic vars 32935 1726853735.10149: starting attempt loop 32935 1726853735.10152: running the handler 32935 1726853735.10176: variable 'lsr_net_profile_ansible_managed' from source: set_fact 32935 1726853735.10186: Evaluated conditional (lsr_net_profile_ansible_managed): True 32935 1726853735.10195: handler run complete 32935 1726853735.10213: attempt loop complete, returning result 32935 1726853735.10220: _execute() done 32935 1726853735.10227: dumping result to json 32935 1726853735.10233: done dumping result, returning 32935 1726853735.10244: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'lsr101.90' [02083763-bbaf-84df-441d-0000000006c1] 32935 1726853735.10256: sending task result for task 02083763-bbaf-84df-441d-0000000006c1 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 32935 1726853735.10408: no more pending results, returning what we have 32935 1726853735.10413: results queue empty 32935 1726853735.10414: checking for any_errors_fatal 32935 1726853735.10421: done checking for any_errors_fatal 32935 1726853735.10422: checking for max_fail_percentage 32935 1726853735.10424: done checking for max_fail_percentage 32935 1726853735.10425: checking to see if all hosts have failed and the running result is not ok 32935 1726853735.10427: done checking to see if all hosts have failed 32935 1726853735.10427: getting the remaining hosts for this loop 32935 1726853735.10429: done getting the remaining hosts for this loop 32935 1726853735.10432: getting the next task for host managed_node1 32935 1726853735.10440: done getting next task for host managed_node1 32935 1726853735.10443: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 32935 1726853735.10446: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853735.10451: getting variables 32935 1726853735.10453: in VariableManager get_vars() 32935 1726853735.10496: Calling all_inventory to load vars for managed_node1 32935 1726853735.10499: Calling groups_inventory to load vars for managed_node1 32935 1726853735.10501: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853735.10513: Calling all_plugins_play to load vars for managed_node1 32935 1726853735.10516: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853735.10519: Calling groups_plugins_play to load vars for managed_node1 32935 1726853735.11202: done sending task result for task 02083763-bbaf-84df-441d-0000000006c1 32935 1726853735.11205: WORKER PROCESS EXITING 32935 1726853735.12598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853735.14665: done with get_vars() 32935 1726853735.14704: done getting variables 32935 1726853735.14760: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853735.14877: variable 'profile' from source: include params 32935 1726853735.14881: variable 'item' from source: include params 32935 1726853735.15068: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in lsr101.90] ************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:35:35 -0400 (0:00:00.079) 0:00:20.286 ****** 32935 1726853735.15104: entering _queue_task() for managed_node1/assert 32935 1726853735.15887: worker is 1 (out of 1 available) 32935 1726853735.15977: exiting _queue_task() for managed_node1/assert 32935 1726853735.15991: done queuing things up, now waiting for results queue to drain 32935 1726853735.15993: waiting for pending results... 
32935 1726853735.16192: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in lsr101.90 32935 1726853735.16354: in run() - task 02083763-bbaf-84df-441d-0000000006c2 32935 1726853735.16381: variable 'ansible_search_path' from source: unknown 32935 1726853735.16385: variable 'ansible_search_path' from source: unknown 32935 1726853735.16423: calling self._execute() 32935 1726853735.16724: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.16727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.16738: variable 'omit' from source: magic vars 32935 1726853735.17320: variable 'ansible_distribution_major_version' from source: facts 32935 1726853735.17324: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853735.17327: variable 'omit' from source: magic vars 32935 1726853735.17330: variable 'omit' from source: magic vars 32935 1726853735.17684: variable 'profile' from source: include params 32935 1726853735.17687: variable 'item' from source: include params 32935 1726853735.17689: variable 'item' from source: include params 32935 1726853735.17692: variable 'omit' from source: magic vars 32935 1726853735.17694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853735.17696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853735.17698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853735.17700: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.17703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.17705: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853735.17707: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.17709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.17802: Set connection var ansible_timeout to 10 32935 1726853735.17809: Set connection var ansible_shell_type to sh 32935 1726853735.17816: Set connection var ansible_pipelining to False 32935 1726853735.17819: Set connection var ansible_connection to ssh 32935 1726853735.17825: Set connection var ansible_shell_executable to /bin/sh 32935 1726853735.17830: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853735.17855: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.17858: variable 'ansible_connection' from source: unknown 32935 1726853735.17863: variable 'ansible_module_compression' from source: unknown 32935 1726853735.17866: variable 'ansible_shell_type' from source: unknown 32935 1726853735.17868: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.17884: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.17887: variable 'ansible_pipelining' from source: unknown 32935 1726853735.17890: variable 'ansible_timeout' from source: unknown 32935 1726853735.17895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.18046: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853735.18064: variable 'omit' from source: magic vars 32935 1726853735.18067: starting attempt loop 32935 1726853735.18069: running the handler 32935 1726853735.18267: variable 'lsr_net_profile_fingerprint' from source: set_fact 32935 1726853735.18273: Evaluated conditional (lsr_net_profile_fingerprint): True 32935 1726853735.18275: handler run complete 32935 1726853735.18278: attempt loop complete, returning result 32935 1726853735.18280: _execute() done 32935 1726853735.18282: dumping result to json 32935 1726853735.18284: done dumping result, returning 32935 1726853735.18285: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in lsr101.90 [02083763-bbaf-84df-441d-0000000006c2] 32935 1726853735.18287: sending task result for task 02083763-bbaf-84df-441d-0000000006c2 32935 1726853735.18346: done sending task result for task 02083763-bbaf-84df-441d-0000000006c2 32935 1726853735.18349: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 32935 1726853735.18406: no more pending results, returning what we have 32935 1726853735.18410: results queue empty 32935 1726853735.18411: checking for any_errors_fatal 32935 1726853735.18425: done checking for any_errors_fatal 32935 1726853735.18426: checking for max_fail_percentage 32935 1726853735.18429: done checking for max_fail_percentage 32935 1726853735.18430: checking to see if all hosts have failed and the running result is not ok 32935 1726853735.18431: done checking to see if all hosts have failed 32935 1726853735.18432: getting the remaining hosts for this loop 32935 1726853735.18434: done getting the remaining hosts for this loop 32935 1726853735.18437: getting the next task for host managed_node1 32935 1726853735.18446: done getting next task for host managed_node1 32935 1726853735.18449: ^ task is: TASK: TEARDOWN: remove profiles. 32935 1726853735.18451: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853735.18457: getting variables 32935 1726853735.18461: in VariableManager get_vars() 32935 1726853735.18508: Calling all_inventory to load vars for managed_node1 32935 1726853735.18511: Calling groups_inventory to load vars for managed_node1 32935 1726853735.18516: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853735.18783: Calling all_plugins_play to load vars for managed_node1 32935 1726853735.18787: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853735.18791: Calling groups_plugins_play to load vars for managed_node1 32935 1726853735.20389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853735.22801: done with get_vars() 32935 1726853735.22830: done getting variables 32935 1726853735.22906: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:58 Friday 20 September 2024 13:35:35 -0400 (0:00:00.078) 0:00:20.364 ****** 32935 1726853735.22936: entering _queue_task() for managed_node1/debug 32935 1726853735.23323: worker is 1 (out of 1 available) 32935 1726853735.23341: exiting _queue_task() for managed_node1/debug 32935 1726853735.23353: done queuing things up, now waiting for results queue to drain 32935 1726853735.23355: waiting for pending results... 32935 1726853735.23689: running TaskExecutor() for managed_node1/TASK: TEARDOWN: remove profiles. 
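[annotation] The three assert tasks from assert_profile_present.yml each check one of the facts set earlier, and the trace shows each conditional evaluating True (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint). A sketch of the three tasks using those condition names and the task names reported above; the real file may add per-assert failure messages that a passing run never shows.

- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint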
32935 1726853735.23694: in run() - task 02083763-bbaf-84df-441d-00000000005d 32935 1726853735.23697: variable 'ansible_search_path' from source: unknown 32935 1726853735.23708: calling self._execute() 32935 1726853735.23807: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.23818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.23831: variable 'omit' from source: magic vars 32935 1726853735.24176: variable 'ansible_distribution_major_version' from source: facts 32935 1726853735.24194: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853735.24206: variable 'omit' from source: magic vars 32935 1726853735.24231: variable 'omit' from source: magic vars 32935 1726853735.24276: variable 'omit' from source: magic vars 32935 1726853735.24324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853735.24364: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853735.24393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853735.24414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.24430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.24462: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853735.24472: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.24480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.24576: Set connection var ansible_timeout to 10 32935 1726853735.24589: Set connection var ansible_shell_type to sh 32935 1726853735.24600: Set connection var ansible_pipelining to False 32935 1726853735.24606: Set connection var ansible_connection to ssh 32935 1726853735.24615: Set connection var ansible_shell_executable to /bin/sh 32935 1726853735.24625: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853735.24652: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.24659: variable 'ansible_connection' from source: unknown 32935 1726853735.24666: variable 'ansible_module_compression' from source: unknown 32935 1726853735.24876: variable 'ansible_shell_type' from source: unknown 32935 1726853735.24879: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.24881: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.24883: variable 'ansible_pipelining' from source: unknown 32935 1726853735.24885: variable 'ansible_timeout' from source: unknown 32935 1726853735.24888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.24890: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853735.24893: variable 'omit' from source: magic vars 32935 1726853735.24895: starting attempt loop 32935 1726853735.24897: running the handler 32935 1726853735.24913: handler run complete 32935 1726853735.24935: attempt loop complete, 
returning result 32935 1726853735.24942: _execute() done 32935 1726853735.24949: dumping result to json 32935 1726853735.24956: done dumping result, returning 32935 1726853735.24967: done running TaskExecutor() for managed_node1/TASK: TEARDOWN: remove profiles. [02083763-bbaf-84df-441d-00000000005d] 32935 1726853735.24978: sending task result for task 02083763-bbaf-84df-441d-00000000005d ok: [managed_node1] => {} MSG: ################################################## 32935 1726853735.25117: no more pending results, returning what we have 32935 1726853735.25121: results queue empty 32935 1726853735.25122: checking for any_errors_fatal 32935 1726853735.25129: done checking for any_errors_fatal 32935 1726853735.25130: checking for max_fail_percentage 32935 1726853735.25131: done checking for max_fail_percentage 32935 1726853735.25132: checking to see if all hosts have failed and the running result is not ok 32935 1726853735.25134: done checking to see if all hosts have failed 32935 1726853735.25134: getting the remaining hosts for this loop 32935 1726853735.25136: done getting the remaining hosts for this loop 32935 1726853735.25139: getting the next task for host managed_node1 32935 1726853735.25147: done getting next task for host managed_node1 32935 1726853735.25152: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32935 1726853735.25154: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853735.25285: done sending task result for task 02083763-bbaf-84df-441d-00000000005d 32935 1726853735.25289: WORKER PROCESS EXITING 32935 1726853735.25299: getting variables 32935 1726853735.25300: in VariableManager get_vars() 32935 1726853735.25334: Calling all_inventory to load vars for managed_node1 32935 1726853735.25336: Calling groups_inventory to load vars for managed_node1 32935 1726853735.25338: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853735.25346: Calling all_plugins_play to load vars for managed_node1 32935 1726853735.25348: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853735.25350: Calling groups_plugins_play to load vars for managed_node1 32935 1726853735.26825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853735.27852: done with get_vars() 32935 1726853735.27874: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:35:35 -0400 (0:00:00.050) 0:00:20.415 ****** 32935 1726853735.27945: entering _queue_task() for managed_node1/include_tasks 32935 1726853735.28203: worker is 1 (out of 1 available) 32935 1726853735.28232: exiting _queue_task() for managed_node1/include_tasks 32935 1726853735.28245: done queuing things up, now waiting for results queue to drain 32935 1726853735.28247: waiting for pending results... 
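The include_tasks queued above is the role's entry-point pattern: roles/network/tasks/main.yml pulls in set_facts.yml, which the following trace records load, filter on tags, and append to the host's task list. A sketch of that include, assuming nothing beyond the task name and the included file that the trace confirms:

- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml
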
32935 1726853735.28433: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 32935 1726853735.28525: in run() - task 02083763-bbaf-84df-441d-000000000065 32935 1726853735.28536: variable 'ansible_search_path' from source: unknown 32935 1726853735.28540: variable 'ansible_search_path' from source: unknown 32935 1726853735.28570: calling self._execute() 32935 1726853735.28644: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.28648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.28657: variable 'omit' from source: magic vars 32935 1726853735.28955: variable 'ansible_distribution_major_version' from source: facts 32935 1726853735.28966: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853735.28983: _execute() done 32935 1726853735.28986: dumping result to json 32935 1726853735.28989: done dumping result, returning 32935 1726853735.28992: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-84df-441d-000000000065] 32935 1726853735.29023: sending task result for task 02083763-bbaf-84df-441d-000000000065 32935 1726853735.29091: done sending task result for task 02083763-bbaf-84df-441d-000000000065 32935 1726853735.29094: WORKER PROCESS EXITING 32935 1726853735.29219: no more pending results, returning what we have 32935 1726853735.29225: in VariableManager get_vars() 32935 1726853735.29273: Calling all_inventory to load vars for managed_node1 32935 1726853735.29276: Calling groups_inventory to load vars for managed_node1 32935 1726853735.29279: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853735.29292: Calling all_plugins_play to load vars for managed_node1 32935 1726853735.29295: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853735.29298: Calling groups_plugins_play to load vars for managed_node1 32935 1726853735.30592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853735.31434: done with get_vars() 32935 1726853735.31452: variable 'ansible_search_path' from source: unknown 32935 1726853735.31453: variable 'ansible_search_path' from source: unknown 32935 1726853735.31483: we have included files to process 32935 1726853735.31484: generating all_blocks data 32935 1726853735.31485: done generating all_blocks data 32935 1726853735.31489: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32935 1726853735.31490: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32935 1726853735.31491: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 32935 1726853735.31865: done processing included file 32935 1726853735.31866: iterating over new_blocks loaded from include file 32935 1726853735.31867: in VariableManager get_vars() 32935 1726853735.31889: done with get_vars() 32935 1726853735.31891: filtering new block on tags 32935 1726853735.31903: done filtering new block on tags 32935 1726853735.31905: in VariableManager get_vars() 32935 1726853735.31918: done with get_vars() 32935 1726853735.31919: filtering new block on tags 32935 1726853735.31931: done filtering new block on tags 32935 1726853735.31932: in 
VariableManager get_vars() 32935 1726853735.31945: done with get_vars() 32935 1726853735.31946: filtering new block on tags 32935 1726853735.31955: done filtering new block on tags 32935 1726853735.31956: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 32935 1726853735.31962: extending task lists for all hosts with included blocks 32935 1726853735.32580: done extending task lists 32935 1726853735.32582: done processing included files 32935 1726853735.32583: results queue empty 32935 1726853735.32583: checking for any_errors_fatal 32935 1726853735.32586: done checking for any_errors_fatal 32935 1726853735.32587: checking for max_fail_percentage 32935 1726853735.32588: done checking for max_fail_percentage 32935 1726853735.32589: checking to see if all hosts have failed and the running result is not ok 32935 1726853735.32590: done checking to see if all hosts have failed 32935 1726853735.32590: getting the remaining hosts for this loop 32935 1726853735.32591: done getting the remaining hosts for this loop 32935 1726853735.32594: getting the next task for host managed_node1 32935 1726853735.32598: done getting next task for host managed_node1 32935 1726853735.32600: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32935 1726853735.32603: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853735.32612: getting variables 32935 1726853735.32613: in VariableManager get_vars() 32935 1726853735.32628: Calling all_inventory to load vars for managed_node1 32935 1726853735.32630: Calling groups_inventory to load vars for managed_node1 32935 1726853735.32632: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853735.32638: Calling all_plugins_play to load vars for managed_node1 32935 1726853735.32641: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853735.32644: Calling groups_plugins_play to load vars for managed_node1 32935 1726853735.33834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853735.34687: done with get_vars() 32935 1726853735.34704: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:35:35 -0400 (0:00:00.068) 0:00:20.483 ****** 32935 1726853735.34760: entering _queue_task() for managed_node1/setup 32935 1726853735.35021: worker is 1 (out of 1 available) 32935 1726853735.35034: exiting _queue_task() for managed_node1/setup 32935 1726853735.35047: done queuing things up, now waiting for results queue to drain 32935 1726853735.35049: waiting for pending results... 32935 1726853735.35233: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 32935 1726853735.35335: in run() - task 02083763-bbaf-84df-441d-000000000883 32935 1726853735.35346: variable 'ansible_search_path' from source: unknown 32935 1726853735.35349: variable 'ansible_search_path' from source: unknown 32935 1726853735.35382: calling self._execute() 32935 1726853735.35487: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.35490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.35496: variable 'omit' from source: magic vars 32935 1726853735.35983: variable 'ansible_distribution_major_version' from source: facts 32935 1726853735.35986: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853735.36075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853735.38093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853735.38136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853735.38176: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853735.38204: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853735.38225: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853735.38290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853735.38310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 32935 1726853735.38327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853735.38352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853735.38366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853735.38407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853735.38423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853735.38440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853735.38468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853735.38480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853735.38587: variable '__network_required_facts' from source: role '' defaults 32935 1726853735.38593: variable 'ansible_facts' from source: unknown 32935 1726853735.39086: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 32935 1726853735.39090: when evaluation is False, skipping this task 32935 1726853735.39093: _execute() done 32935 1726853735.39095: dumping result to json 32935 1726853735.39098: done dumping result, returning 32935 1726853735.39105: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-84df-441d-000000000883] 32935 1726853735.39107: sending task result for task 02083763-bbaf-84df-441d-000000000883 32935 1726853735.39188: done sending task result for task 02083763-bbaf-84df-441d-000000000883 32935 1726853735.39192: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 1726853735.39237: no more pending results, returning what we have 32935 1726853735.39241: results queue empty 32935 1726853735.39242: checking for any_errors_fatal 32935 1726853735.39243: done checking for any_errors_fatal 32935 1726853735.39243: checking for max_fail_percentage 32935 1726853735.39245: done checking for max_fail_percentage 32935 1726853735.39246: checking to see if all hosts have failed and the running result is not ok 32935 1726853735.39247: done checking to see if all hosts have failed 32935 1726853735.39248: getting the remaining hosts for this loop 32935 1726853735.39249: done getting the remaining hosts for 
this loop 32935 1726853735.39253: getting the next task for host managed_node1 32935 1726853735.39263: done getting next task for host managed_node1 32935 1726853735.39267: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 32935 1726853735.39270: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853735.39288: getting variables 32935 1726853735.39290: in VariableManager get_vars() 32935 1726853735.39332: Calling all_inventory to load vars for managed_node1 32935 1726853735.39335: Calling groups_inventory to load vars for managed_node1 32935 1726853735.39337: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853735.39347: Calling all_plugins_play to load vars for managed_node1 32935 1726853735.39349: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853735.39352: Calling groups_plugins_play to load vars for managed_node1 32935 1726853735.40753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853735.42103: done with get_vars() 32935 1726853735.42123: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:35:35 -0400 (0:00:00.074) 0:00:20.557 ****** 32935 1726853735.42206: entering _queue_task() for managed_node1/stat 32935 1726853735.42460: worker is 1 (out of 1 available) 32935 1726853735.42476: exiting _queue_task() for managed_node1/stat 32935 1726853735.42488: done queuing things up, now waiting for results queue to drain 32935 1726853735.42490: waiting for pending results... 
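The skip recorded above ("censored ... 'no_log: true' was specified") is the role's fact-gating pattern: run the setup module only when one of the facts the role needs is missing, so a play that has already gathered facts skips the extra round trip. A sketch of the task at set_facts.yml:3; the setup module, no_log, and the when: expression come from the trace, while the gather_subset value is an assumption:

- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min   # assumption; the trace confirms only the setup module, no_log, and the condition below
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true
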
32935 1726853735.42668: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 32935 1726853735.42774: in run() - task 02083763-bbaf-84df-441d-000000000885 32935 1726853735.42785: variable 'ansible_search_path' from source: unknown 32935 1726853735.42788: variable 'ansible_search_path' from source: unknown 32935 1726853735.42816: calling self._execute() 32935 1726853735.42891: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.42895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.42904: variable 'omit' from source: magic vars 32935 1726853735.43181: variable 'ansible_distribution_major_version' from source: facts 32935 1726853735.43191: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853735.43303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853735.43498: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853735.43535: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853735.43577: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853735.43628: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853735.43844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853735.43847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853735.43881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853735.43904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853735.43984: variable '__network_is_ostree' from source: set_fact 32935 1726853735.43991: Evaluated conditional (not __network_is_ostree is defined): False 32935 1726853735.43994: when evaluation is False, skipping this task 32935 1726853735.43997: _execute() done 32935 1726853735.44000: dumping result to json 32935 1726853735.44002: done dumping result, returning 32935 1726853735.44010: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-84df-441d-000000000885] 32935 1726853735.44015: sending task result for task 02083763-bbaf-84df-441d-000000000885 32935 1726853735.44320: done sending task result for task 02083763-bbaf-84df-441d-000000000885 32935 1726853735.44322: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32935 1726853735.44413: no more pending results, returning what we have 32935 1726853735.44416: results queue empty 32935 1726853735.44417: checking for any_errors_fatal 32935 1726853735.44422: done checking for any_errors_fatal 32935 1726853735.44422: checking for 
max_fail_percentage 32935 1726853735.44424: done checking for max_fail_percentage 32935 1726853735.44425: checking to see if all hosts have failed and the running result is not ok 32935 1726853735.44426: done checking to see if all hosts have failed 32935 1726853735.44427: getting the remaining hosts for this loop 32935 1726853735.44428: done getting the remaining hosts for this loop 32935 1726853735.44436: getting the next task for host managed_node1 32935 1726853735.44442: done getting next task for host managed_node1 32935 1726853735.44446: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32935 1726853735.44450: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853735.44469: getting variables 32935 1726853735.44473: in VariableManager get_vars() 32935 1726853735.44512: Calling all_inventory to load vars for managed_node1 32935 1726853735.44515: Calling groups_inventory to load vars for managed_node1 32935 1726853735.44517: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853735.44526: Calling all_plugins_play to load vars for managed_node1 32935 1726853735.44528: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853735.44531: Calling groups_plugins_play to load vars for managed_node1 32935 1726853735.50772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853735.52057: done with get_vars() 32935 1726853735.52082: done getting variables 32935 1726853735.52119: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:35:35 -0400 (0:00:00.099) 0:00:20.657 ****** 32935 1726853735.52143: entering _queue_task() for managed_node1/set_fact 32935 1726853735.52412: worker is 1 (out of 1 available) 32935 1726853735.52427: exiting _queue_task() for managed_node1/set_fact 32935 1726853735.52439: done queuing things up, now waiting for results queue to drain 32935 1726853735.52441: waiting for pending results... 
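The two skipped tasks at set_facts.yml:12 and :17 are the usual ostree-detection pair: stat a marker path once, then cache the answer in __network_is_ostree so every later run of the role skips both tasks, which is exactly what "Evaluated conditional (not __network_is_ostree is defined): False" shows here. A sketch of that pair; only the modules, the when: expression, and the fact name are confirmed by the trace, while the marker path and the register name are assumptions:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted   # assumed marker file
  register: __ostree_stat      # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_stat.stat.exists }}"
  when: not __network_is_ostree is defined
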
32935 1726853735.52625: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 32935 1726853735.52742: in run() - task 02083763-bbaf-84df-441d-000000000886 32935 1726853735.52752: variable 'ansible_search_path' from source: unknown 32935 1726853735.52755: variable 'ansible_search_path' from source: unknown 32935 1726853735.52788: calling self._execute() 32935 1726853735.52862: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.52865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.52874: variable 'omit' from source: magic vars 32935 1726853735.53161: variable 'ansible_distribution_major_version' from source: facts 32935 1726853735.53168: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853735.53286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853735.53540: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853735.53677: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853735.53681: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853735.53709: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853735.53890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853735.53894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853735.53897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853735.53899: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853735.53989: variable '__network_is_ostree' from source: set_fact 32935 1726853735.53997: Evaluated conditional (not __network_is_ostree is defined): False 32935 1726853735.54001: when evaluation is False, skipping this task 32935 1726853735.54003: _execute() done 32935 1726853735.54006: dumping result to json 32935 1726853735.54009: done dumping result, returning 32935 1726853735.54018: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-84df-441d-000000000886] 32935 1726853735.54020: sending task result for task 02083763-bbaf-84df-441d-000000000886 32935 1726853735.54116: done sending task result for task 02083763-bbaf-84df-441d-000000000886 32935 1726853735.54119: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 32935 1726853735.54317: no more pending results, returning what we have 32935 1726853735.54320: results queue empty 32935 1726853735.54321: checking for any_errors_fatal 32935 1726853735.54326: done checking for any_errors_fatal 32935 
1726853735.54327: checking for max_fail_percentage 32935 1726853735.54328: done checking for max_fail_percentage 32935 1726853735.54329: checking to see if all hosts have failed and the running result is not ok 32935 1726853735.54330: done checking to see if all hosts have failed 32935 1726853735.54331: getting the remaining hosts for this loop 32935 1726853735.54332: done getting the remaining hosts for this loop 32935 1726853735.54336: getting the next task for host managed_node1 32935 1726853735.54345: done getting next task for host managed_node1 32935 1726853735.54349: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 32935 1726853735.54352: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853735.54368: getting variables 32935 1726853735.54370: in VariableManager get_vars() 32935 1726853735.54413: Calling all_inventory to load vars for managed_node1 32935 1726853735.54416: Calling groups_inventory to load vars for managed_node1 32935 1726853735.54418: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853735.54428: Calling all_plugins_play to load vars for managed_node1 32935 1726853735.54431: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853735.54435: Calling groups_plugins_play to load vars for managed_node1 32935 1726853735.55376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853735.56250: done with get_vars() 32935 1726853735.56269: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:35:35 -0400 (0:00:00.041) 0:00:20.699 ****** 32935 1726853735.56341: entering _queue_task() for managed_node1/service_facts 32935 1726853735.56594: worker is 1 (out of 1 available) 32935 1726853735.56608: exiting _queue_task() for managed_node1/service_facts 32935 1726853735.56623: done queuing things up, now waiting for results queue to drain 32935 1726853735.56625: waiting for pending results... 
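The task queued above (set_facts.yml:21) is a bare service_facts call; its result is the large ansible_facts.services map that the module returns further down in this trace (for example, NetworkManager.service is reported as running/enabled). A sketch of the task, with a comment showing how the gathered data can be read afterwards, using a service name taken from the output below:

- name: Check which services are running
  service_facts:

# Example read of the gathered data:
#   ansible_facts.services['NetworkManager.service'].state == 'running'
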
32935 1726853735.56893: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 32935 1726853735.57027: in run() - task 02083763-bbaf-84df-441d-000000000888 32935 1726853735.57047: variable 'ansible_search_path' from source: unknown 32935 1726853735.57053: variable 'ansible_search_path' from source: unknown 32935 1726853735.57102: calling self._execute() 32935 1726853735.57218: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.57235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.57250: variable 'omit' from source: magic vars 32935 1726853735.57687: variable 'ansible_distribution_major_version' from source: facts 32935 1726853735.57704: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853735.57754: variable 'omit' from source: magic vars 32935 1726853735.57803: variable 'omit' from source: magic vars 32935 1726853735.57843: variable 'omit' from source: magic vars 32935 1726853735.57898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853735.57940: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853735.57974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853735.58079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.58082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853735.58085: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853735.58092: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.58095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.58175: Set connection var ansible_timeout to 10 32935 1726853735.58194: Set connection var ansible_shell_type to sh 32935 1726853735.58202: Set connection var ansible_pipelining to False 32935 1726853735.58205: Set connection var ansible_connection to ssh 32935 1726853735.58213: Set connection var ansible_shell_executable to /bin/sh 32935 1726853735.58218: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853735.58238: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.58241: variable 'ansible_connection' from source: unknown 32935 1726853735.58244: variable 'ansible_module_compression' from source: unknown 32935 1726853735.58246: variable 'ansible_shell_type' from source: unknown 32935 1726853735.58248: variable 'ansible_shell_executable' from source: unknown 32935 1726853735.58251: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853735.58253: variable 'ansible_pipelining' from source: unknown 32935 1726853735.58255: variable 'ansible_timeout' from source: unknown 32935 1726853735.58262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853735.58429: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853735.58434: variable 'omit' from source: magic vars 32935 
1726853735.58440: starting attempt loop 32935 1726853735.58443: running the handler 32935 1726853735.58455: _low_level_execute_command(): starting 32935 1726853735.58463: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853735.58953: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853735.58990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853735.58993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853735.58996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853735.59046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853735.59049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853735.59053: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853735.59102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853735.60802: stdout chunk (state=3): >>>/root <<< 32935 1726853735.60948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853735.60951: stdout chunk (state=3): >>><<< 32935 1726853735.60953: stderr chunk (state=3): >>><<< 32935 1726853735.61075: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853735.61079: _low_level_execute_command(): starting 32935 1726853735.61081: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789 `" && echo ansible-tmp-1726853735.6098478-33922-67055987708789="` echo /root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789 `" ) && sleep 0' 32935 1726853735.61577: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853735.61599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853735.61654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853735.61658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853735.61704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853735.63580: stdout chunk (state=3): >>>ansible-tmp-1726853735.6098478-33922-67055987708789=/root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789 <<< 32935 1726853735.63737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853735.63741: stdout chunk (state=3): >>><<< 32935 1726853735.63744: stderr chunk (state=3): >>><<< 32935 1726853735.63761: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853735.6098478-33922-67055987708789=/root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853735.63884: variable 'ansible_module_compression' from source: unknown 32935 1726853735.63923: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 32935 1726853735.63973: variable 'ansible_facts' from source: unknown 32935 1726853735.64033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/AnsiballZ_service_facts.py 32935 1726853735.64135: Sending initial data 32935 1726853735.64139: Sent initial data (161 bytes) 32935 1726853735.64591: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853735.64594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853735.64597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853735.64599: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 32935 1726853735.64601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853735.64603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853735.64658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853735.64661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853735.64698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853735.66224: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32935 1726853735.66231: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853735.66264: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853735.66305: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp9pqf_fyl /root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/AnsiballZ_service_facts.py <<< 32935 1726853735.66312: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/AnsiballZ_service_facts.py" <<< 32935 1726853735.66347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp9pqf_fyl" to remote "/root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/AnsiballZ_service_facts.py" <<< 32935 1726853735.66349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/AnsiballZ_service_facts.py" <<< 32935 1726853735.66884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853735.66924: stderr chunk (state=3): >>><<< 32935 1726853735.66927: stdout chunk (state=3): >>><<< 32935 1726853735.66988: done transferring module to remote 32935 1726853735.66997: _low_level_execute_command(): starting 32935 1726853735.67002: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/ /root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/AnsiballZ_service_facts.py && sleep 0' 32935 1726853735.67450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853735.67454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853735.67456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853735.67458: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853735.67462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853735.67515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853735.67524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853735.67526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853735.67560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853735.69280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853735.69311: stderr chunk (state=3): >>><<< 32935 1726853735.69314: stdout chunk (state=3): >>><<< 32935 1726853735.69328: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853735.69331: _low_level_execute_command(): starting 32935 1726853735.69335: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/AnsiballZ_service_facts.py && sleep 0' 32935 1726853735.69764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853735.69768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853735.69798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853735.69801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853735.69803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853735.69805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853735.69860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853735.69864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853735.69867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853735.69921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853737.23184: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 32935 1726853737.24597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853737.24618: stderr chunk (state=3): >>><<< 32935 1726853737.24621: stdout chunk (state=3): >>><<< 32935 1726853737.24651: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": 
"dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": 
"dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", 
"source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853737.25872: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853737.25880: _low_level_execute_command(): starting 32935 1726853737.25886: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853735.6098478-33922-67055987708789/ > /dev/null 2>&1 && sleep 0' 32935 1726853737.27008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853737.27035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853737.27049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853737.27078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853737.27109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853737.27120: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853737.27133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853737.27187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853737.27199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853737.27281: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853737.27494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853737.27566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 
1726853737.29577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853737.29581: stdout chunk (state=3): >>><<< 32935 1726853737.29584: stderr chunk (state=3): >>><<< 32935 1726853737.29587: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853737.29589: handler run complete 32935 1726853737.29721: variable 'ansible_facts' from source: unknown 32935 1726853737.29946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853737.30742: variable 'ansible_facts' from source: unknown 32935 1726853737.30911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853737.31145: attempt loop complete, returning result 32935 1726853737.31155: _execute() done 32935 1726853737.31162: dumping result to json 32935 1726853737.31233: done dumping result, returning 32935 1726853737.31254: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-84df-441d-000000000888] 32935 1726853737.31264: sending task result for task 02083763-bbaf-84df-441d-000000000888 32935 1726853737.32317: done sending task result for task 02083763-bbaf-84df-441d-000000000888 32935 1726853737.32320: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 1726853737.32375: no more pending results, returning what we have 32935 1726853737.32377: results queue empty 32935 1726853737.32378: checking for any_errors_fatal 32935 1726853737.32380: done checking for any_errors_fatal 32935 1726853737.32380: checking for max_fail_percentage 32935 1726853737.32381: done checking for max_fail_percentage 32935 1726853737.32382: checking to see if all hosts have failed and the running result is not ok 32935 1726853737.32383: done checking to see if all hosts have failed 32935 1726853737.32383: getting the remaining hosts for this loop 32935 1726853737.32384: done getting the remaining hosts for this loop 32935 1726853737.32386: getting the next task for host managed_node1 32935 1726853737.32390: done getting next task for host managed_node1 32935 1726853737.32392: ^ task is: TASK: fedora.linux_system_roles.network : Check 
which packages are installed 32935 1726853737.32396: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853737.32405: getting variables 32935 1726853737.32406: in VariableManager get_vars() 32935 1726853737.32427: Calling all_inventory to load vars for managed_node1 32935 1726853737.32429: Calling groups_inventory to load vars for managed_node1 32935 1726853737.32431: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853737.32438: Calling all_plugins_play to load vars for managed_node1 32935 1726853737.32440: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853737.32442: Calling groups_plugins_play to load vars for managed_node1 32935 1726853737.33112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853737.34342: done with get_vars() 32935 1726853737.34367: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:35:37 -0400 (0:00:01.781) 0:00:22.480 ****** 32935 1726853737.34481: entering _queue_task() for managed_node1/package_facts 32935 1726853737.34782: worker is 1 (out of 1 available) 32935 1726853737.34797: exiting _queue_task() for managed_node1/package_facts 32935 1726853737.34810: done queuing things up, now waiting for results queue to drain 32935 1726853737.34811: waiting for pending results... 
32935 1726853737.35005: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 32935 1726853737.35121: in run() - task 02083763-bbaf-84df-441d-000000000889 32935 1726853737.35131: variable 'ansible_search_path' from source: unknown 32935 1726853737.35134: variable 'ansible_search_path' from source: unknown 32935 1726853737.35168: calling self._execute() 32935 1726853737.35243: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853737.35247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853737.35262: variable 'omit' from source: magic vars 32935 1726853737.35543: variable 'ansible_distribution_major_version' from source: facts 32935 1726853737.35553: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853737.35558: variable 'omit' from source: magic vars 32935 1726853737.35611: variable 'omit' from source: magic vars 32935 1726853737.35638: variable 'omit' from source: magic vars 32935 1726853737.35675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853737.35703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853737.35720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853737.35733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853737.35742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853737.35768: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853737.35772: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853737.35775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853737.35846: Set connection var ansible_timeout to 10 32935 1726853737.35850: Set connection var ansible_shell_type to sh 32935 1726853737.35857: Set connection var ansible_pipelining to False 32935 1726853737.35860: Set connection var ansible_connection to ssh 32935 1726853737.35867: Set connection var ansible_shell_executable to /bin/sh 32935 1726853737.35873: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853737.35891: variable 'ansible_shell_executable' from source: unknown 32935 1726853737.35894: variable 'ansible_connection' from source: unknown 32935 1726853737.35896: variable 'ansible_module_compression' from source: unknown 32935 1726853737.35898: variable 'ansible_shell_type' from source: unknown 32935 1726853737.35901: variable 'ansible_shell_executable' from source: unknown 32935 1726853737.35903: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853737.35907: variable 'ansible_pipelining' from source: unknown 32935 1726853737.35909: variable 'ansible_timeout' from source: unknown 32935 1726853737.35918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853737.36055: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853737.36066: variable 'omit' from source: magic vars 32935 
1726853737.36070: starting attempt loop 32935 1726853737.36080: running the handler 32935 1726853737.36090: _low_level_execute_command(): starting 32935 1726853737.36098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853737.36604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853737.36609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853737.36612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853737.36669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853737.36678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853737.36748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853737.38328: stdout chunk (state=3): >>>/root <<< 32935 1726853737.38451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853737.38453: stdout chunk (state=3): >>><<< 32935 1726853737.38455: stderr chunk (state=3): >>><<< 32935 1726853737.38479: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853737.38541: _low_level_execute_command(): starting 32935 1726853737.38545: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016 `" && echo 
ansible-tmp-1726853737.384839-34002-263648805133016="` echo /root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016 `" ) && sleep 0' 32935 1726853737.38908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853737.38911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853737.38914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853737.38923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853737.38970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853737.38978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853737.39015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853737.40864: stdout chunk (state=3): >>>ansible-tmp-1726853737.384839-34002-263648805133016=/root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016 <<< 32935 1726853737.40974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853737.41001: stderr chunk (state=3): >>><<< 32935 1726853737.41003: stdout chunk (state=3): >>><<< 32935 1726853737.41013: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853737.384839-34002-263648805133016=/root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853737.41076: variable 'ansible_module_compression' from source: unknown 32935 1726853737.41096: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 32935 1726853737.41146: variable 'ansible_facts' from source: unknown 32935 1726853737.41272: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/AnsiballZ_package_facts.py 32935 1726853737.41376: Sending initial data 32935 1726853737.41380: Sent initial data (161 bytes) 32935 1726853737.41798: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853737.41802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853737.41804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853737.41856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853737.41860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853737.41904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853737.43460: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 32935 1726853737.43468: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853737.43498: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853737.43538: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp65m8rrk5 /root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/AnsiballZ_package_facts.py <<< 32935 1726853737.43545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/AnsiballZ_package_facts.py" <<< 32935 1726853737.43580: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp65m8rrk5" to remote "/root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/AnsiballZ_package_facts.py" <<< 32935 1726853737.44619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853737.44655: stderr chunk (state=3): >>><<< 32935 1726853737.44661: stdout chunk (state=3): >>><<< 32935 1726853737.44699: done transferring module to remote 32935 1726853737.44708: _low_level_execute_command(): starting 32935 1726853737.44713: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/ /root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/AnsiballZ_package_facts.py && sleep 0' 32935 1726853737.45133: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853737.45136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853737.45139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 32935 1726853737.45141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853737.45143: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853737.45195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853737.45202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853737.45239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853737.46983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853737.47010: stderr chunk (state=3): >>><<< 32935 1726853737.47013: stdout chunk (state=3): >>><<< 32935 1726853737.47023: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853737.47026: _low_level_execute_command(): starting 32935 1726853737.47032: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/AnsiballZ_package_facts.py && sleep 0' 32935 1726853737.47492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853737.47495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853737.47498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853737.47500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853737.47502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853737.47551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853737.47558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853737.47561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853737.47600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853737.91938: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 32935 1726853737.91949: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 32935 1726853737.92021: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": 
"4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": 
"procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 32935 1726853737.92073: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 32935 1726853737.92092: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": 
"1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 32935 1726853737.92104: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 32935 1726853737.92108: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": 
"initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 32935 1726853737.92118: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": 
"510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 32935 1726853737.92147: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", 
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 32935 1726853737.92156: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": 
"4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 32935 1726853737.93951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853737.93954: stdout chunk (state=3): >>><<< 32935 1726853737.93956: stderr chunk (state=3): >>><<< 32935 1726853737.94182: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.35", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853737.96340: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853737.96368: _low_level_execute_command(): starting 32935 1726853737.96386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853737.384839-34002-263648805133016/ > /dev/null 2>&1 && sleep 0' 32935 1726853737.97094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853737.97151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853737.97174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853737.97199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853737.97268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853737.99185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853737.99197: stdout chunk (state=3): >>><<< 32935 1726853737.99210: stderr chunk 
(state=3): >>><<< 32935 1726853737.99227: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853737.99238: handler run complete 32935 1726853738.00104: variable 'ansible_facts' from source: unknown 32935 1726853738.01267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.05431: variable 'ansible_facts' from source: unknown 32935 1726853738.06392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.07133: attempt loop complete, returning result 32935 1726853738.07153: _execute() done 32935 1726853738.07164: dumping result to json 32935 1726853738.07386: done dumping result, returning 32935 1726853738.07400: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-84df-441d-000000000889] 32935 1726853738.07408: sending task result for task 02083763-bbaf-84df-441d-000000000889 32935 1726853738.10031: done sending task result for task 02083763-bbaf-84df-441d-000000000889 32935 1726853738.10034: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 1726853738.10194: no more pending results, returning what we have 32935 1726853738.10197: results queue empty 32935 1726853738.10198: checking for any_errors_fatal 32935 1726853738.10203: done checking for any_errors_fatal 32935 1726853738.10204: checking for max_fail_percentage 32935 1726853738.10205: done checking for max_fail_percentage 32935 1726853738.10206: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.10207: done checking to see if all hosts have failed 32935 1726853738.10208: getting the remaining hosts for this loop 32935 1726853738.10209: done getting the remaining hosts for this loop 32935 1726853738.10212: getting the next task for host managed_node1 32935 1726853738.10219: done getting next task for host managed_node1 32935 1726853738.10222: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 32935 1726853738.10225: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853738.10236: getting variables 32935 1726853738.10238: in VariableManager get_vars() 32935 1726853738.10474: Calling all_inventory to load vars for managed_node1 32935 1726853738.10478: Calling groups_inventory to load vars for managed_node1 32935 1726853738.10481: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.10490: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.10492: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.10495: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.12913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.15291: done with get_vars() 32935 1726853738.15325: done getting variables 32935 1726853738.15394: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:35:38 -0400 (0:00:00.809) 0:00:23.289 ****** 32935 1726853738.15434: entering _queue_task() for managed_node1/debug 32935 1726853738.15899: worker is 1 (out of 1 available) 32935 1726853738.15913: exiting _queue_task() for managed_node1/debug 32935 1726853738.15924: done queuing things up, now waiting for results queue to drain 32935 1726853738.15926: waiting for pending results... 
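The package_facts result above is censored because the task runs with no_log enabled ('_ansible_no_log': True), and the recorded module_args were {"manager": ["auto"], "strategy": "first"}. A minimal sketch of such a fact-gathering task, assuming a plain playbook task rather than the role's actual YAML (which this log does not show), could look like:

    # Sketch only: mirrors the module_args and no_log behaviour visible in the log;
    # the real task definition inside fedora.linux_system_roles.network is assumed.
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto      # log records module_args {"manager": ["auto"], ...}
        strategy: first    # log records "strategy": "first"
      no_log: true         # why the task result above is reported as "censored"

The gathered data lands in ansible_facts.packages, keyed by package name, with each entry listing name, version, release, epoch, arch and source, exactly as in the JSON dump above.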
32935 1726853738.16193: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 32935 1726853738.16310: in run() - task 02083763-bbaf-84df-441d-000000000066 32935 1726853738.16327: variable 'ansible_search_path' from source: unknown 32935 1726853738.16333: variable 'ansible_search_path' from source: unknown 32935 1726853738.16374: calling self._execute() 32935 1726853738.16478: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.16520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.16524: variable 'omit' from source: magic vars 32935 1726853738.16911: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.16956: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.16964: variable 'omit' from source: magic vars 32935 1726853738.17004: variable 'omit' from source: magic vars 32935 1726853738.17178: variable 'network_provider' from source: set_fact 32935 1726853738.17183: variable 'omit' from source: magic vars 32935 1726853738.17192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853738.17235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853738.17266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853738.17308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853738.17325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853738.17361: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853738.17373: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.17677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.17680: Set connection var ansible_timeout to 10 32935 1726853738.17682: Set connection var ansible_shell_type to sh 32935 1726853738.17895: Set connection var ansible_pipelining to False 32935 1726853738.17899: Set connection var ansible_connection to ssh 32935 1726853738.17901: Set connection var ansible_shell_executable to /bin/sh 32935 1726853738.17903: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853738.17905: variable 'ansible_shell_executable' from source: unknown 32935 1726853738.17908: variable 'ansible_connection' from source: unknown 32935 1726853738.17910: variable 'ansible_module_compression' from source: unknown 32935 1726853738.17912: variable 'ansible_shell_type' from source: unknown 32935 1726853738.17914: variable 'ansible_shell_executable' from source: unknown 32935 1726853738.17916: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.17918: variable 'ansible_pipelining' from source: unknown 32935 1726853738.17920: variable 'ansible_timeout' from source: unknown 32935 1726853738.17922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.18136: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 32935 1726853738.18191: variable 'omit' from source: magic vars 32935 1726853738.18202: starting attempt loop 32935 1726853738.18209: running the handler 32935 1726853738.18332: handler run complete 32935 1726853738.18352: attempt loop complete, returning result 32935 1726853738.18363: _execute() done 32935 1726853738.18476: dumping result to json 32935 1726853738.18479: done dumping result, returning 32935 1726853738.18482: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-84df-441d-000000000066] 32935 1726853738.18485: sending task result for task 02083763-bbaf-84df-441d-000000000066 32935 1726853738.18876: done sending task result for task 02083763-bbaf-84df-441d-000000000066 32935 1726853738.18880: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 32935 1726853738.18939: no more pending results, returning what we have 32935 1726853738.18942: results queue empty 32935 1726853738.18943: checking for any_errors_fatal 32935 1726853738.18949: done checking for any_errors_fatal 32935 1726853738.18949: checking for max_fail_percentage 32935 1726853738.18951: done checking for max_fail_percentage 32935 1726853738.18951: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.18953: done checking to see if all hosts have failed 32935 1726853738.18953: getting the remaining hosts for this loop 32935 1726853738.18955: done getting the remaining hosts for this loop 32935 1726853738.18961: getting the next task for host managed_node1 32935 1726853738.18968: done getting next task for host managed_node1 32935 1726853738.18973: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32935 1726853738.18976: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853738.18987: getting variables 32935 1726853738.18989: in VariableManager get_vars() 32935 1726853738.19029: Calling all_inventory to load vars for managed_node1 32935 1726853738.19032: Calling groups_inventory to load vars for managed_node1 32935 1726853738.19034: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.19044: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.19047: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.19050: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.22203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.24416: done with get_vars() 32935 1726853738.24446: done getting variables 32935 1726853738.24504: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:35:38 -0400 (0:00:00.091) 0:00:23.380 ****** 32935 1726853738.24535: entering _queue_task() for managed_node1/fail 32935 1726853738.24895: worker is 1 (out of 1 available) 32935 1726853738.24909: exiting _queue_task() for managed_node1/fail 32935 1726853738.24924: done queuing things up, now waiting for results queue to drain 32935 1726853738.24925: waiting for pending results... 
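The "Print network provider" task above resolves to the debug action plugin and reports "Using network provider: nm", with the network_provider variable coming from an earlier set_fact. A minimal sketch of a task that would produce that output, assuming this is roughly what roles/network/tasks/main.yml:7 contains (the actual role source is not part of this log), is:

    # Sketch only: task name, debug action and output message are taken from the
    # log above; the exact YAML in the role is assumed.
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"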
32935 1726853738.25236: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 32935 1726853738.25398: in run() - task 02083763-bbaf-84df-441d-000000000067 32935 1726853738.25418: variable 'ansible_search_path' from source: unknown 32935 1726853738.25427: variable 'ansible_search_path' from source: unknown 32935 1726853738.25472: calling self._execute() 32935 1726853738.25588: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.25599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.25620: variable 'omit' from source: magic vars 32935 1726853738.26339: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.26577: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.26645: variable 'network_state' from source: role '' defaults 32935 1726853738.26725: Evaluated conditional (network_state != {}): False 32935 1726853738.26732: when evaluation is False, skipping this task 32935 1726853738.26738: _execute() done 32935 1726853738.26743: dumping result to json 32935 1726853738.26748: done dumping result, returning 32935 1726853738.26757: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-84df-441d-000000000067] 32935 1726853738.26784: sending task result for task 02083763-bbaf-84df-441d-000000000067 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853738.27135: no more pending results, returning what we have 32935 1726853738.27139: results queue empty 32935 1726853738.27140: checking for any_errors_fatal 32935 1726853738.27146: done checking for any_errors_fatal 32935 1726853738.27147: checking for max_fail_percentage 32935 1726853738.27149: done checking for max_fail_percentage 32935 1726853738.27149: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.27150: done checking to see if all hosts have failed 32935 1726853738.27151: getting the remaining hosts for this loop 32935 1726853738.27153: done getting the remaining hosts for this loop 32935 1726853738.27156: getting the next task for host managed_node1 32935 1726853738.27166: done getting next task for host managed_node1 32935 1726853738.27173: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32935 1726853738.27176: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853738.27197: getting variables 32935 1726853738.27199: in VariableManager get_vars() 32935 1726853738.27241: Calling all_inventory to load vars for managed_node1 32935 1726853738.27244: Calling groups_inventory to load vars for managed_node1 32935 1726853738.27246: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.27261: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.27264: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.27267: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.28204: done sending task result for task 02083763-bbaf-84df-441d-000000000067 32935 1726853738.28207: WORKER PROCESS EXITING 32935 1726853738.28938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.31454: done with get_vars() 32935 1726853738.31615: done getting variables 32935 1726853738.31730: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:35:38 -0400 (0:00:00.072) 0:00:23.453 ****** 32935 1726853738.31783: entering _queue_task() for managed_node1/fail 32935 1726853738.32187: worker is 1 (out of 1 available) 32935 1726853738.32208: exiting _queue_task() for managed_node1/fail 32935 1726853738.32221: done queuing things up, now waiting for results queue to drain 32935 1726853738.32222: waiting for pending results... 
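The guard that was just skipped (tasks/main.yml:11) and the one queued immediately above (tasks/main.yml:18) are both fail-module tasks whose when evaluation short-circuits on network_state != {}, since the role's network_state default is an empty dict. A minimal sketch of that pattern follows; only the conditions the log actually evaluated are shown, the msg wording is assumed, and the distribution-version check may be inherited from an enclosing block rather than written on the task itself.

  - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
    fail:
      msg: Applying the network state configuration is not supported by the initscripts provider  # wording assumed
    when:
      - ansible_distribution_major_version != '6'
      - network_state != {}   # evaluated False here, so the task is skipped before any provider check runs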
32935 1726853738.32466: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 32935 1726853738.32609: in run() - task 02083763-bbaf-84df-441d-000000000068 32935 1726853738.32626: variable 'ansible_search_path' from source: unknown 32935 1726853738.32632: variable 'ansible_search_path' from source: unknown 32935 1726853738.32673: calling self._execute() 32935 1726853738.32769: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.32782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.32798: variable 'omit' from source: magic vars 32935 1726853738.33156: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.33178: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.33318: variable 'network_state' from source: role '' defaults 32935 1726853738.33333: Evaluated conditional (network_state != {}): False 32935 1726853738.33342: when evaluation is False, skipping this task 32935 1726853738.33350: _execute() done 32935 1726853738.33377: dumping result to json 32935 1726853738.33386: done dumping result, returning 32935 1726853738.33399: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-84df-441d-000000000068] 32935 1726853738.33410: sending task result for task 02083763-bbaf-84df-441d-000000000068 32935 1726853738.33640: done sending task result for task 02083763-bbaf-84df-441d-000000000068 32935 1726853738.33645: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853738.33691: no more pending results, returning what we have 32935 1726853738.33695: results queue empty 32935 1726853738.33696: checking for any_errors_fatal 32935 1726853738.33702: done checking for any_errors_fatal 32935 1726853738.33703: checking for max_fail_percentage 32935 1726853738.33705: done checking for max_fail_percentage 32935 1726853738.33705: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.33706: done checking to see if all hosts have failed 32935 1726853738.33707: getting the remaining hosts for this loop 32935 1726853738.33709: done getting the remaining hosts for this loop 32935 1726853738.33712: getting the next task for host managed_node1 32935 1726853738.33719: done getting next task for host managed_node1 32935 1726853738.33723: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32935 1726853738.33726: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853738.33745: getting variables 32935 1726853738.33746: in VariableManager get_vars() 32935 1726853738.33787: Calling all_inventory to load vars for managed_node1 32935 1726853738.33790: Calling groups_inventory to load vars for managed_node1 32935 1726853738.33792: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.33801: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.33803: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.33806: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.35901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.37649: done with get_vars() 32935 1726853738.37678: done getting variables 32935 1726853738.37738: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:35:38 -0400 (0:00:00.060) 0:00:23.513 ****** 32935 1726853738.37777: entering _queue_task() for managed_node1/fail 32935 1726853738.38122: worker is 1 (out of 1 available) 32935 1726853738.38134: exiting _queue_task() for managed_node1/fail 32935 1726853738.38145: done queuing things up, now waiting for results queue to drain 32935 1726853738.38147: waiting for pending results... 
32935 1726853738.38494: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 32935 1726853738.38829: in run() - task 02083763-bbaf-84df-441d-000000000069 32935 1726853738.38833: variable 'ansible_search_path' from source: unknown 32935 1726853738.38835: variable 'ansible_search_path' from source: unknown 32935 1726853738.38861: calling self._execute() 32935 1726853738.38980: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.38992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.39010: variable 'omit' from source: magic vars 32935 1726853738.39421: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.39577: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.39620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853738.42020: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853738.42099: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853738.42136: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853738.42175: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853738.42208: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853738.42292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.42342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.42408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.42426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.42444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.42548: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.42575: Evaluated conditional (ansible_distribution_major_version | int > 9): True 32935 1726853738.42699: variable 'ansible_distribution' from source: facts 32935 1726853738.42734: variable '__network_rh_distros' from source: role '' defaults 32935 1726853738.42738: Evaluated conditional (ansible_distribution in __network_rh_distros): True 32935 1726853738.42990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.43018: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.43046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.43169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.43174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.43177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.43196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.43223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.43268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.43291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.43334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.43364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.43399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.43440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.43461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.43793: variable 'network_connections' from source: task vars 32935 1726853738.43809: variable 'interface' from source: play vars 32935 1726853738.43926: variable 'interface' from source: play vars 32935 1726853738.43930: variable 'vlan_interface' from source: play vars 32935 1726853738.43965: variable 'vlan_interface' from source: play vars 32935 1726853738.43980: variable 'network_state' from source: role '' defaults 
32935 1726853738.44054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853738.44236: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853738.44285: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853738.44319: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853738.44362: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853738.44466: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853738.44484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853738.44487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.44505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853738.44536: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 32935 1726853738.44543: when evaluation is False, skipping this task 32935 1726853738.44550: _execute() done 32935 1726853738.44602: dumping result to json 32935 1726853738.44604: done dumping result, returning 32935 1726853738.44607: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-84df-441d-000000000069] 32935 1726853738.44610: sending task result for task 02083763-bbaf-84df-441d-000000000069 skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 32935 1726853738.44821: no more pending results, returning what we have 32935 1726853738.44825: results queue empty 32935 1726853738.44826: checking for any_errors_fatal 32935 1726853738.44832: done checking for any_errors_fatal 32935 1726853738.44832: checking for max_fail_percentage 32935 1726853738.44834: done checking for max_fail_percentage 32935 1726853738.44835: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.44836: done checking to see if all hosts have failed 32935 1726853738.44837: getting the remaining hosts for this loop 32935 1726853738.44839: done getting the remaining hosts for this loop 32935 1726853738.44842: getting the next task for host managed_node1 32935 1726853738.44851: done getting next task for host managed_node1 32935 1726853738.44854: ^ task is: 
TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32935 1726853738.44857: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853738.44879: getting variables 32935 1726853738.44881: in VariableManager get_vars() 32935 1726853738.44923: Calling all_inventory to load vars for managed_node1 32935 1726853738.44926: Calling groups_inventory to load vars for managed_node1 32935 1726853738.44929: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.44940: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.44943: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.44946: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.45485: done sending task result for task 02083763-bbaf-84df-441d-000000000069 32935 1726853738.45488: WORKER PROCESS EXITING 32935 1726853738.46592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.48265: done with get_vars() 32935 1726853738.48292: done getting variables 32935 1726853738.48357: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:35:38 -0400 (0:00:00.106) 0:00:23.619 ****** 32935 1726853738.48397: entering _queue_task() for managed_node1/dnf 32935 1726853738.48762: worker is 1 (out of 1 available) 32935 1726853738.48977: exiting _queue_task() for managed_node1/dnf 32935 1726853738.48989: done queuing things up, now waiting for results queue to drain 32935 1726853738.48991: waiting for pending results... 
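The teaming guard at tasks/main.yml:25, skipped just above, probes both network_connections and network_state for entries of type "team" using a selectattr chain. The when expression below is copied verbatim from the logged false_condition; the fail wrapper and its message are a hedged reconstruction rather than the role's exact source.

  - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
    fail:
      msg: Team interfaces are not supported on EL10 or later  # wording assumed
    when:
      # both of these were evaluated True earlier in the log
      - ansible_distribution_major_version | int > 9
      - ansible_distribution in __network_rh_distros
      # verbatim from the logged false_condition; no team-typed entries exist here, so the task is skipped
      - network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0
        or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0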
32935 1726853738.49087: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 32935 1726853738.49233: in run() - task 02083763-bbaf-84df-441d-00000000006a 32935 1726853738.49253: variable 'ansible_search_path' from source: unknown 32935 1726853738.49265: variable 'ansible_search_path' from source: unknown 32935 1726853738.49307: calling self._execute() 32935 1726853738.49416: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.49430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.49449: variable 'omit' from source: magic vars 32935 1726853738.49875: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.49878: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.50031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853738.52338: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853738.52419: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853738.52463: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853738.52506: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853738.52535: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853738.52686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.52690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.52704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.52748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.52768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.52898: variable 'ansible_distribution' from source: facts 32935 1726853738.52976: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.52979: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 32935 1726853738.53053: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853738.53195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.53224: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.53255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.53302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.53321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.53372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.53447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.53450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.53477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.53497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.53535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.53564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.53590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.53623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.53636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.53785: variable 'network_connections' from source: task vars 32935 1726853738.53976: variable 'interface' from source: play vars 32935 1726853738.53980: variable 'interface' from source: play vars 32935 1726853738.53982: variable 'vlan_interface' from source: play vars 32935 1726853738.53984: variable 'vlan_interface' from source: play vars 32935 1726853738.54032: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853738.54217: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853738.54261: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853738.54300: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853738.54342: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853738.54393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853738.54422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853738.54466: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.54501: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853738.54562: variable '__network_team_connections_defined' from source: role '' defaults 32935 1726853738.54834: variable 'network_connections' from source: task vars 32935 1726853738.54845: variable 'interface' from source: play vars 32935 1726853738.54916: variable 'interface' from source: play vars 32935 1726853738.54928: variable 'vlan_interface' from source: play vars 32935 1726853738.54995: variable 'vlan_interface' from source: play vars 32935 1726853738.55023: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32935 1726853738.55031: when evaluation is False, skipping this task 32935 1726853738.55085: _execute() done 32935 1726853738.55088: dumping result to json 32935 1726853738.55090: done dumping result, returning 32935 1726853738.55093: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-84df-441d-00000000006a] 32935 1726853738.55095: sending task result for task 02083763-bbaf-84df-441d-00000000006a skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32935 1726853738.55240: no more pending results, returning what we have 32935 1726853738.55243: results queue empty 32935 1726853738.55245: checking for any_errors_fatal 32935 1726853738.55253: done checking for any_errors_fatal 32935 1726853738.55254: checking for max_fail_percentage 32935 1726853738.55256: done checking for max_fail_percentage 32935 1726853738.55257: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.55261: done checking to see if all hosts have failed 32935 1726853738.55261: getting the remaining hosts for this loop 32935 1726853738.55263: done getting the remaining hosts for this loop 32935 1726853738.55267: getting the next task for host managed_node1 
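The skipped check at tasks/main.yml:36 only runs the dnf action when the role has detected wireless or team connections; here both __network_wireless_connections_defined and __network_team_connections_defined are false, and an earlier gate required ansible_distribution == 'Fedora' or a major version above 7. A rough sketch of that shape, with the caveat that the dnf arguments (package list, state) and the check_mode flag are illustrative assumptions since the log records only the module name and the when conditions:

  - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
    ansible.builtin.dnf:
      name: "{{ network_packages }}"   # assumed; the log only shows the dnf action plugin being loaded
      state: latest                    # assumed
    check_mode: true                   # assumed: an "are updates available" probe should not change the host
    when:
      - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
      - __network_wireless_connections_defined or __network_team_connections_defined   # False here, so skipped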
32935 1726853738.55278: done getting next task for host managed_node1 32935 1726853738.55282: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32935 1726853738.55285: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853738.55304: getting variables 32935 1726853738.55306: in VariableManager get_vars() 32935 1726853738.55350: Calling all_inventory to load vars for managed_node1 32935 1726853738.55353: Calling groups_inventory to load vars for managed_node1 32935 1726853738.55356: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.55370: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.55478: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.55482: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.56184: done sending task result for task 02083763-bbaf-84df-441d-00000000006a 32935 1726853738.56187: WORKER PROCESS EXITING 32935 1726853738.57082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.58695: done with get_vars() 32935 1726853738.58719: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 32935 1726853738.58797: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:35:38 -0400 (0:00:00.104) 0:00:23.723 ****** 32935 1726853738.58829: entering _queue_task() for managed_node1/yum 32935 1726853738.59221: worker is 1 (out of 1 available) 32935 1726853738.59235: exiting _queue_task() for managed_node1/yum 32935 1726853738.59248: done queuing things up, now waiting for results queue to drain 32935 1726853738.59250: waiting for pending results... 
32935 1726853738.59509: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 32935 1726853738.59642: in run() - task 02083763-bbaf-84df-441d-00000000006b 32935 1726853738.59665: variable 'ansible_search_path' from source: unknown 32935 1726853738.59678: variable 'ansible_search_path' from source: unknown 32935 1726853738.59721: calling self._execute() 32935 1726853738.59821: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.59831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.59845: variable 'omit' from source: magic vars 32935 1726853738.60251: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.60276: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.60454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853738.62721: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853738.62799: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853738.62844: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853738.62889: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853738.62921: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853738.63008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.63575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.63578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.63581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.63583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.63585: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.63607: Evaluated conditional (ansible_distribution_major_version | int < 8): False 32935 1726853738.63614: when evaluation is False, skipping this task 32935 1726853738.63620: _execute() done 32935 1726853738.63627: dumping result to json 32935 1726853738.63634: done dumping result, returning 32935 1726853738.63645: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-84df-441d-00000000006b] 32935 
1726853738.63653: sending task result for task 02083763-bbaf-84df-441d-00000000006b skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 32935 1726853738.63808: no more pending results, returning what we have 32935 1726853738.63812: results queue empty 32935 1726853738.63814: checking for any_errors_fatal 32935 1726853738.63819: done checking for any_errors_fatal 32935 1726853738.63820: checking for max_fail_percentage 32935 1726853738.63822: done checking for max_fail_percentage 32935 1726853738.63822: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.63824: done checking to see if all hosts have failed 32935 1726853738.63825: getting the remaining hosts for this loop 32935 1726853738.63827: done getting the remaining hosts for this loop 32935 1726853738.63830: getting the next task for host managed_node1 32935 1726853738.63839: done getting next task for host managed_node1 32935 1726853738.63843: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32935 1726853738.63846: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853738.63869: getting variables 32935 1726853738.63873: in VariableManager get_vars() 32935 1726853738.63915: Calling all_inventory to load vars for managed_node1 32935 1726853738.63918: Calling groups_inventory to load vars for managed_node1 32935 1726853738.63921: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.63931: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.63934: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.63937: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.64784: done sending task result for task 02083763-bbaf-84df-441d-00000000006b 32935 1726853738.64787: WORKER PROCESS EXITING 32935 1726853738.65787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.67376: done with get_vars() 32935 1726853738.67405: done getting variables 32935 1726853738.67468: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:35:38 -0400 (0:00:00.086) 0:00:23.810 ****** 32935 1726853738.67503: entering _queue_task() for managed_node1/fail 32935 1726853738.67866: worker is 1 (out of 1 available) 32935 1726853738.68082: exiting _queue_task() for managed_node1/fail 32935 1726853738.68095: done queuing things up, now waiting for results queue to drain 32935 1726853738.68096: waiting for pending results... 
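The YUM counterpart at tasks/main.yml:48 covers pre-EL8 hosts and is skipped because ansible_distribution_major_version | int < 8 evaluates False; the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above is ansible-core 2.17 resolving the yum action to the dnf plugin even though the task is written against yum. A hedged sketch, again with assumed module arguments:

  - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
    ansible.builtin.yum:               # resolved to the dnf action plugin on this controller, as logged above
      name: "{{ network_packages }}"   # assumed
      state: latest                    # assumed
    check_mode: true                   # assumed
    when:
      - ansible_distribution_major_version | int < 8   # False on this host; any further conditions were never evaluated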
32935 1726853738.68224: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 32935 1726853738.68383: in run() - task 02083763-bbaf-84df-441d-00000000006c 32935 1726853738.68403: variable 'ansible_search_path' from source: unknown 32935 1726853738.68410: variable 'ansible_search_path' from source: unknown 32935 1726853738.68460: calling self._execute() 32935 1726853738.68583: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.68595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.68612: variable 'omit' from source: magic vars 32935 1726853738.69025: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.69100: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.69165: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853738.69354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853738.71480: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853738.71555: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853738.71605: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853738.71642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853738.71677: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853738.71766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.71912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.71915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.71918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.71920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.71966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.71996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.72029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.72078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.72098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.72149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.72183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.72212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.72265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.72287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.72489: variable 'network_connections' from source: task vars 32935 1726853738.72508: variable 'interface' from source: play vars 32935 1726853738.72590: variable 'interface' from source: play vars 32935 1726853738.72605: variable 'vlan_interface' from source: play vars 32935 1726853738.72777: variable 'vlan_interface' from source: play vars 32935 1726853738.72783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853738.72942: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853738.72990: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853738.73031: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853738.73074: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853738.73129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853738.73161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853738.73196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.73234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 
1726853738.73299: variable '__network_team_connections_defined' from source: role '' defaults 32935 1726853738.73567: variable 'network_connections' from source: task vars 32935 1726853738.73663: variable 'interface' from source: play vars 32935 1726853738.73666: variable 'interface' from source: play vars 32935 1726853738.73669: variable 'vlan_interface' from source: play vars 32935 1726853738.73716: variable 'vlan_interface' from source: play vars 32935 1726853738.73743: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32935 1726853738.73751: when evaluation is False, skipping this task 32935 1726853738.73757: _execute() done 32935 1726853738.73773: dumping result to json 32935 1726853738.73781: done dumping result, returning 32935 1726853738.73793: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-84df-441d-00000000006c] 32935 1726853738.73810: sending task result for task 02083763-bbaf-84df-441d-00000000006c 32935 1726853738.74017: done sending task result for task 02083763-bbaf-84df-441d-00000000006c 32935 1726853738.74020: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32935 1726853738.74075: no more pending results, returning what we have 32935 1726853738.74079: results queue empty 32935 1726853738.74081: checking for any_errors_fatal 32935 1726853738.74086: done checking for any_errors_fatal 32935 1726853738.74086: checking for max_fail_percentage 32935 1726853738.74088: done checking for max_fail_percentage 32935 1726853738.74089: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.74091: done checking to see if all hosts have failed 32935 1726853738.74091: getting the remaining hosts for this loop 32935 1726853738.74093: done getting the remaining hosts for this loop 32935 1726853738.74097: getting the next task for host managed_node1 32935 1726853738.74104: done getting next task for host managed_node1 32935 1726853738.74108: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 32935 1726853738.74111: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853738.74130: getting variables 32935 1726853738.74131: in VariableManager get_vars() 32935 1726853738.74177: Calling all_inventory to load vars for managed_node1 32935 1726853738.74180: Calling groups_inventory to load vars for managed_node1 32935 1726853738.74183: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.74194: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.74197: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.74200: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.75752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.77349: done with get_vars() 32935 1726853738.77386: done getting variables 32935 1726853738.77449: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:35:38 -0400 (0:00:00.099) 0:00:23.910 ****** 32935 1726853738.77491: entering _queue_task() for managed_node1/package 32935 1726853738.77979: worker is 1 (out of 1 available) 32935 1726853738.77992: exiting _queue_task() for managed_node1/package 32935 1726853738.78003: done queuing things up, now waiting for results queue to drain 32935 1726853738.78004: waiting for pending results... 32935 1726853738.78209: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 32935 1726853738.78379: in run() - task 02083763-bbaf-84df-441d-00000000006d 32935 1726853738.78444: variable 'ansible_search_path' from source: unknown 32935 1726853738.78448: variable 'ansible_search_path' from source: unknown 32935 1726853738.78451: calling self._execute() 32935 1726853738.78544: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.78556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.78584: variable 'omit' from source: magic vars 32935 1726853738.78884: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.78893: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.79026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853738.79222: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853738.79257: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853738.79285: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853738.79339: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853738.79420: variable 'network_packages' from source: role '' defaults 32935 1726853738.79495: variable '__network_provider_setup' from source: role '' defaults 32935 1726853738.79503: variable '__network_service_name_default_nm' from source: role '' defaults 32935 1726853738.79549: variable 
'__network_service_name_default_nm' from source: role '' defaults 32935 1726853738.79556: variable '__network_packages_default_nm' from source: role '' defaults 32935 1726853738.79603: variable '__network_packages_default_nm' from source: role '' defaults 32935 1726853738.79720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853738.81717: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853738.81769: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853738.81797: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853738.81820: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853738.81841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853738.81901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.81921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.81939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.81968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.81979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.82010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.82026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.82042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.82070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.82083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.82228: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32935 1726853738.82308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.82323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.82339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.82364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.82377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.82440: variable 'ansible_python' from source: facts 32935 1726853738.82462: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32935 1726853738.82519: variable '__network_wpa_supplicant_required' from source: role '' defaults 32935 1726853738.82573: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32935 1726853738.82656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.82675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.82692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.82717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.82732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.82762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853738.82781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853738.82797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.82823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853738.82834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853738.82929: variable 'network_connections' from source: task vars 32935 1726853738.82935: variable 'interface' from source: play vars 32935 1726853738.83007: variable 'interface' from source: play vars 32935 1726853738.83015: variable 'vlan_interface' from source: play vars 32935 1726853738.83087: variable 'vlan_interface' from source: play vars 32935 1726853738.83135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853738.83153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853738.83179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853738.83199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853738.83236: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853738.83426: variable 'network_connections' from source: task vars 32935 1726853738.83430: variable 'interface' from source: play vars 32935 1726853738.83676: variable 'interface' from source: play vars 32935 1726853738.83679: variable 'vlan_interface' from source: play vars 32935 1726853738.83681: variable 'vlan_interface' from source: play vars 32935 1726853738.83684: variable '__network_packages_default_wireless' from source: role '' defaults 32935 1726853738.83730: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853738.84040: variable 'network_connections' from source: task vars 32935 1726853738.84051: variable 'interface' from source: play vars 32935 1726853738.84123: variable 'interface' from source: play vars 32935 1726853738.84143: variable 'vlan_interface' from source: play vars 32935 1726853738.84220: variable 'vlan_interface' from source: play vars 32935 1726853738.84248: variable '__network_packages_default_team' from source: role '' defaults 32935 1726853738.84332: variable '__network_team_connections_defined' from source: role '' defaults 32935 1726853738.84615: variable 'network_connections' from source: task vars 32935 1726853738.84624: variable 'interface' from source: play vars 32935 1726853738.84689: variable 'interface' from source: play vars 32935 1726853738.84701: variable 'vlan_interface' from source: play vars 32935 1726853738.84765: variable 'vlan_interface' from source: play vars 32935 1726853738.84822: variable '__network_service_name_default_initscripts' from source: role '' defaults 32935 1726853738.84883: variable '__network_service_name_default_initscripts' from source: role '' defaults 32935 1726853738.84896: variable '__network_packages_default_initscripts' from source: role '' defaults 32935 1726853738.84960: variable '__network_packages_default_initscripts' from source: role '' defaults 32935 1726853738.85131: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32935 1726853738.85429: variable 'network_connections' from source: task vars 32935 
1726853738.85433: variable 'interface' from source: play vars 32935 1726853738.85479: variable 'interface' from source: play vars 32935 1726853738.85486: variable 'vlan_interface' from source: play vars 32935 1726853738.85525: variable 'vlan_interface' from source: play vars 32935 1726853738.85532: variable 'ansible_distribution' from source: facts 32935 1726853738.85534: variable '__network_rh_distros' from source: role '' defaults 32935 1726853738.85540: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.85554: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32935 1726853738.85660: variable 'ansible_distribution' from source: facts 32935 1726853738.85663: variable '__network_rh_distros' from source: role '' defaults 32935 1726853738.85667: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.85679: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32935 1726853738.85776: variable 'ansible_distribution' from source: facts 32935 1726853738.85782: variable '__network_rh_distros' from source: role '' defaults 32935 1726853738.85784: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.85812: variable 'network_provider' from source: set_fact 32935 1726853738.85823: variable 'ansible_facts' from source: unknown 32935 1726853738.86194: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 32935 1726853738.86197: when evaluation is False, skipping this task 32935 1726853738.86200: _execute() done 32935 1726853738.86202: dumping result to json 32935 1726853738.86204: done dumping result, returning 32935 1726853738.86212: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-84df-441d-00000000006d] 32935 1726853738.86214: sending task result for task 02083763-bbaf-84df-441d-00000000006d 32935 1726853738.86310: done sending task result for task 02083763-bbaf-84df-441d-00000000006d 32935 1726853738.86313: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 32935 1726853738.86377: no more pending results, returning what we have 32935 1726853738.86381: results queue empty 32935 1726853738.86382: checking for any_errors_fatal 32935 1726853738.86387: done checking for any_errors_fatal 32935 1726853738.86388: checking for max_fail_percentage 32935 1726853738.86389: done checking for max_fail_percentage 32935 1726853738.86390: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.86391: done checking to see if all hosts have failed 32935 1726853738.86392: getting the remaining hosts for this loop 32935 1726853738.86393: done getting the remaining hosts for this loop 32935 1726853738.86397: getting the next task for host managed_node1 32935 1726853738.86406: done getting next task for host managed_node1 32935 1726853738.86409: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32935 1726853738.86412: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853738.86433: getting variables 32935 1726853738.86435: in VariableManager get_vars() 32935 1726853738.86477: Calling all_inventory to load vars for managed_node1 32935 1726853738.86480: Calling groups_inventory to load vars for managed_node1 32935 1726853738.86482: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.86491: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.86493: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.86496: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.87717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.89467: done with get_vars() 32935 1726853738.89493: done getting variables 32935 1726853738.89554: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:35:38 -0400 (0:00:00.120) 0:00:24.031 ****** 32935 1726853738.89588: entering _queue_task() for managed_node1/package 32935 1726853738.89860: worker is 1 (out of 1 available) 32935 1726853738.89877: exiting _queue_task() for managed_node1/package 32935 1726853738.89890: done queuing things up, now waiting for results queue to drain 32935 1726853738.89892: waiting for pending results... 
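The skip recorded just above comes from a conditional that compares the role's requested packages against the gathered package facts: the task only runs when network_packages is not already a subset of ansible_facts.packages.keys(). Below is a minimal sketch of a task gated this way; it reuses the variable names and the exact conditional string from the log, but it is not the verbatim source of roles/network/tasks/main.yml:73.

```yaml
# Hedged sketch only -- not the role's verbatim source.
# network_packages and the when: expression are taken from the log above.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # Skip when every requested package already shows up in the package facts;
  # this is exactly the false_condition reported in the skip result above.
  when: not network_packages is subset(ansible_facts.packages.keys())
```

On this run the test evaluated to False (everything in network_packages is already installed), so the package action never reaches the target host.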
32935 1726853738.90089: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 32935 1726853738.90187: in run() - task 02083763-bbaf-84df-441d-00000000006e 32935 1726853738.90197: variable 'ansible_search_path' from source: unknown 32935 1726853738.90201: variable 'ansible_search_path' from source: unknown 32935 1726853738.90230: calling self._execute() 32935 1726853738.90313: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.90317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.90327: variable 'omit' from source: magic vars 32935 1726853738.90608: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.90618: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.90702: variable 'network_state' from source: role '' defaults 32935 1726853738.90710: Evaluated conditional (network_state != {}): False 32935 1726853738.90713: when evaluation is False, skipping this task 32935 1726853738.90717: _execute() done 32935 1726853738.90721: dumping result to json 32935 1726853738.90724: done dumping result, returning 32935 1726853738.90733: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-84df-441d-00000000006e] 32935 1726853738.90736: sending task result for task 02083763-bbaf-84df-441d-00000000006e skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853738.91017: no more pending results, returning what we have 32935 1726853738.91020: results queue empty 32935 1726853738.91021: checking for any_errors_fatal 32935 1726853738.91026: done checking for any_errors_fatal 32935 1726853738.91026: checking for max_fail_percentage 32935 1726853738.91028: done checking for max_fail_percentage 32935 1726853738.91029: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.91030: done checking to see if all hosts have failed 32935 1726853738.91030: getting the remaining hosts for this loop 32935 1726853738.91031: done getting the remaining hosts for this loop 32935 1726853738.91034: getting the next task for host managed_node1 32935 1726853738.91041: done getting next task for host managed_node1 32935 1726853738.91044: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32935 1726853738.91047: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853738.91063: getting variables 32935 1726853738.91065: in VariableManager get_vars() 32935 1726853738.91102: Calling all_inventory to load vars for managed_node1 32935 1726853738.91105: Calling groups_inventory to load vars for managed_node1 32935 1726853738.91108: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.91117: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.91119: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.91122: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.91684: done sending task result for task 02083763-bbaf-84df-441d-00000000006e 32935 1726853738.91688: WORKER PROCESS EXITING 32935 1726853738.92818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853738.94545: done with get_vars() 32935 1726853738.94569: done getting variables 32935 1726853738.94626: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:35:38 -0400 (0:00:00.050) 0:00:24.082 ****** 32935 1726853738.94662: entering _queue_task() for managed_node1/package 32935 1726853738.94972: worker is 1 (out of 1 available) 32935 1726853738.94986: exiting _queue_task() for managed_node1/package 32935 1726853738.94997: done queuing things up, now waiting for results queue to drain 32935 1726853738.94998: waiting for pending results... 
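Both network_state-gated install tasks (the one skipped just above and the python3-libnmstate task queued here) hinge on the same test: network_state must be non-empty. A hedged sketch of what such tasks could look like follows; package names are inferred from the task names, and this is not the role's verbatim source.

```yaml
# Hedged sketch -- not the verbatim role source. The when: expression matches
# the false_condition reported for both skips in this log; package names are
# inferred from the task names.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}
```

Since this play never sets network_state, it keeps its role default of an empty dict and both tasks are skipped.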
32935 1726853738.95244: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 32935 1726853738.95400: in run() - task 02083763-bbaf-84df-441d-00000000006f 32935 1726853738.95420: variable 'ansible_search_path' from source: unknown 32935 1726853738.95428: variable 'ansible_search_path' from source: unknown 32935 1726853738.95500: calling self._execute() 32935 1726853738.95584: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853738.95595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853738.95613: variable 'omit' from source: magic vars 32935 1726853738.96000: variable 'ansible_distribution_major_version' from source: facts 32935 1726853738.96056: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853738.96154: variable 'network_state' from source: role '' defaults 32935 1726853738.96181: Evaluated conditional (network_state != {}): False 32935 1726853738.96211: when evaluation is False, skipping this task 32935 1726853738.96219: _execute() done 32935 1726853738.96277: dumping result to json 32935 1726853738.96280: done dumping result, returning 32935 1726853738.96283: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-84df-441d-00000000006f] 32935 1726853738.96286: sending task result for task 02083763-bbaf-84df-441d-00000000006f 32935 1726853738.96497: done sending task result for task 02083763-bbaf-84df-441d-00000000006f 32935 1726853738.96500: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853738.96550: no more pending results, returning what we have 32935 1726853738.96554: results queue empty 32935 1726853738.96556: checking for any_errors_fatal 32935 1726853738.96580: done checking for any_errors_fatal 32935 1726853738.96582: checking for max_fail_percentage 32935 1726853738.96585: done checking for max_fail_percentage 32935 1726853738.96586: checking to see if all hosts have failed and the running result is not ok 32935 1726853738.96587: done checking to see if all hosts have failed 32935 1726853738.96588: getting the remaining hosts for this loop 32935 1726853738.96590: done getting the remaining hosts for this loop 32935 1726853738.96594: getting the next task for host managed_node1 32935 1726853738.96604: done getting next task for host managed_node1 32935 1726853738.96608: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32935 1726853738.96612: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853738.96633: getting variables 32935 1726853738.96636: in VariableManager get_vars() 32935 1726853738.96908: Calling all_inventory to load vars for managed_node1 32935 1726853738.96910: Calling groups_inventory to load vars for managed_node1 32935 1726853738.96913: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853738.96922: Calling all_plugins_play to load vars for managed_node1 32935 1726853738.96925: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853738.96928: Calling groups_plugins_play to load vars for managed_node1 32935 1726853738.98431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853739.00047: done with get_vars() 32935 1726853739.00078: done getting variables 32935 1726853739.00138: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:35:39 -0400 (0:00:00.055) 0:00:24.137 ****** 32935 1726853739.00178: entering _queue_task() for managed_node1/service 32935 1726853739.00611: worker is 1 (out of 1 available) 32935 1726853739.00623: exiting _queue_task() for managed_node1/service 32935 1726853739.00634: done queuing things up, now waiting for results queue to drain 32935 1726853739.00636: waiting for pending results... 
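The task queued here restarts NetworkManager only if the connection profiles include wireless or team interfaces; the entries below resolve __network_wireless_connections_defined and __network_team_connections_defined and then skip the task because neither is true. A hedged sketch of a restart task gated this way (service name inferred from the task name, not the role's verbatim source):

```yaml
# Hedged sketch -- not the verbatim role source. The when: expression is the
# false_condition reported in the skip result below; the service name is
# inferred from the task name.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```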
32935 1726853739.00880: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 32935 1726853739.00998: in run() - task 02083763-bbaf-84df-441d-000000000070 32935 1726853739.01176: variable 'ansible_search_path' from source: unknown 32935 1726853739.01180: variable 'ansible_search_path' from source: unknown 32935 1726853739.01183: calling self._execute() 32935 1726853739.01185: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853739.01188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853739.01190: variable 'omit' from source: magic vars 32935 1726853739.01583: variable 'ansible_distribution_major_version' from source: facts 32935 1726853739.01600: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853739.01726: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853739.01931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853739.04223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853739.04302: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853739.04342: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853739.04389: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853739.04419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853739.04508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.04554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.04595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.04640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.04663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.04718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.04786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.04789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 32935 1726853739.04823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.04842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.04896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.04924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.04954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.05109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.05113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.05211: variable 'network_connections' from source: task vars 32935 1726853739.05236: variable 'interface' from source: play vars 32935 1726853739.05313: variable 'interface' from source: play vars 32935 1726853739.05331: variable 'vlan_interface' from source: play vars 32935 1726853739.05403: variable 'vlan_interface' from source: play vars 32935 1726853739.05484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853739.05651: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853739.05699: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853739.05731: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853739.05764: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853739.05813: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853739.05838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853739.05884: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.05902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853739.05993: variable '__network_team_connections_defined' from source: role '' 
defaults 32935 1726853739.06216: variable 'network_connections' from source: task vars 32935 1726853739.06226: variable 'interface' from source: play vars 32935 1726853739.06293: variable 'interface' from source: play vars 32935 1726853739.06304: variable 'vlan_interface' from source: play vars 32935 1726853739.06372: variable 'vlan_interface' from source: play vars 32935 1726853739.06400: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 32935 1726853739.06426: when evaluation is False, skipping this task 32935 1726853739.06429: _execute() done 32935 1726853739.06431: dumping result to json 32935 1726853739.06433: done dumping result, returning 32935 1726853739.06438: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-84df-441d-000000000070] 32935 1726853739.06536: sending task result for task 02083763-bbaf-84df-441d-000000000070 32935 1726853739.06607: done sending task result for task 02083763-bbaf-84df-441d-000000000070 32935 1726853739.06610: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 32935 1726853739.06690: no more pending results, returning what we have 32935 1726853739.06693: results queue empty 32935 1726853739.06694: checking for any_errors_fatal 32935 1726853739.06701: done checking for any_errors_fatal 32935 1726853739.06701: checking for max_fail_percentage 32935 1726853739.06704: done checking for max_fail_percentage 32935 1726853739.06705: checking to see if all hosts have failed and the running result is not ok 32935 1726853739.06706: done checking to see if all hosts have failed 32935 1726853739.06707: getting the remaining hosts for this loop 32935 1726853739.06708: done getting the remaining hosts for this loop 32935 1726853739.06712: getting the next task for host managed_node1 32935 1726853739.06720: done getting next task for host managed_node1 32935 1726853739.06725: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32935 1726853739.06727: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853739.06745: getting variables 32935 1726853739.06747: in VariableManager get_vars() 32935 1726853739.06793: Calling all_inventory to load vars for managed_node1 32935 1726853739.06796: Calling groups_inventory to load vars for managed_node1 32935 1726853739.06799: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853739.06810: Calling all_plugins_play to load vars for managed_node1 32935 1726853739.06813: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853739.06816: Calling groups_plugins_play to load vars for managed_node1 32935 1726853739.08568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853739.10312: done with get_vars() 32935 1726853739.10347: done getting variables 32935 1726853739.10412: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:35:39 -0400 (0:00:00.102) 0:00:24.240 ****** 32935 1726853739.10443: entering _queue_task() for managed_node1/service 32935 1726853739.11378: worker is 1 (out of 1 available) 32935 1726853739.11390: exiting _queue_task() for managed_node1/service 32935 1726853739.11402: done queuing things up, now waiting for results queue to drain 32935 1726853739.11404: waiting for pending results... 32935 1726853739.12091: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 32935 1726853739.12290: in run() - task 02083763-bbaf-84df-441d-000000000071 32935 1726853739.12309: variable 'ansible_search_path' from source: unknown 32935 1726853739.12316: variable 'ansible_search_path' from source: unknown 32935 1726853739.12364: calling self._execute() 32935 1726853739.12778: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853739.12783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853739.12786: variable 'omit' from source: magic vars 32935 1726853739.13580: variable 'ansible_distribution_major_version' from source: facts 32935 1726853739.13583: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853739.13904: variable 'network_provider' from source: set_fact 32935 1726853739.13908: variable 'network_state' from source: role '' defaults 32935 1726853739.13910: Evaluated conditional (network_provider == "nm" or network_state != {}): True 32935 1726853739.13914: variable 'omit' from source: magic vars 32935 1726853739.13978: variable 'omit' from source: magic vars 32935 1726853739.14042: variable 'network_service_name' from source: role '' defaults 32935 1726853739.14260: variable 'network_service_name' from source: role '' defaults 32935 1726853739.14490: variable '__network_provider_setup' from source: role '' defaults 32935 1726853739.14502: variable '__network_service_name_default_nm' from source: role '' defaults 32935 1726853739.14778: variable '__network_service_name_default_nm' from source: role '' defaults 32935 1726853739.14782: variable '__network_packages_default_nm' from source: role '' defaults 
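Unlike the preceding tasks, this one's conditional evaluates True, so the log that follows runs all the way through connection setup and module transfer. Here is a hedged sketch of an enable-and-start task shaped like this one; network_service_name and the when: expression come from the variable resolution and conditional evaluation just shown, while the started/enabled states are assumptions based on the task name rather than the role's verbatim source.

```yaml
# Hedged sketch -- not the verbatim role source. network_service_name and the
# when: expression appear in the log above; state/enabled are assumed from the
# task name "Enable and start NetworkManager".
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```

Because the action resolves to the systemd module on this host, the subsequent entries show ansible-core reusing the existing SSH control master, creating a remote temp directory, and transferring AnsiballZ_systemd.py to it.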
32935 1726853739.14784: variable '__network_packages_default_nm' from source: role '' defaults 32935 1726853739.15100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853739.19402: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853739.19653: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853739.19700: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853739.19752: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853739.19816: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853739.20014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.20111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.20179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.20228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.20249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.20302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.20337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.20363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.20402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.20422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.20674: variable '__network_packages_default_gobject_packages' from source: role '' defaults 32935 1726853739.20808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.20877: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.20880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.20919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.20966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.21087: variable 'ansible_python' from source: facts 32935 1726853739.21118: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 32935 1726853739.21228: variable '__network_wpa_supplicant_required' from source: role '' defaults 32935 1726853739.21322: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32935 1726853739.21457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.21493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.21528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.21589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.21613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.21721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.21733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.21736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.21778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.21806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.21997: variable 'network_connections' from 
source: task vars 32935 1726853739.22016: variable 'interface' from source: play vars 32935 1726853739.22579: variable 'interface' from source: play vars 32935 1726853739.22582: variable 'vlan_interface' from source: play vars 32935 1726853739.22644: variable 'vlan_interface' from source: play vars 32935 1726853739.22907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853739.23336: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853739.23418: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853739.23473: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853739.23563: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853739.23712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853739.23747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853739.23876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.23881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853739.23889: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853739.24148: variable 'network_connections' from source: task vars 32935 1726853739.24161: variable 'interface' from source: play vars 32935 1726853739.24243: variable 'interface' from source: play vars 32935 1726853739.24261: variable 'vlan_interface' from source: play vars 32935 1726853739.24340: variable 'vlan_interface' from source: play vars 32935 1726853739.24381: variable '__network_packages_default_wireless' from source: role '' defaults 32935 1726853739.24465: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853739.24745: variable 'network_connections' from source: task vars 32935 1726853739.24762: variable 'interface' from source: play vars 32935 1726853739.24836: variable 'interface' from source: play vars 32935 1726853739.24848: variable 'vlan_interface' from source: play vars 32935 1726853739.24924: variable 'vlan_interface' from source: play vars 32935 1726853739.24978: variable '__network_packages_default_team' from source: role '' defaults 32935 1726853739.25037: variable '__network_team_connections_defined' from source: role '' defaults 32935 1726853739.25342: variable 'network_connections' from source: task vars 32935 1726853739.25352: variable 'interface' from source: play vars 32935 1726853739.25429: variable 'interface' from source: play vars 32935 1726853739.25477: variable 'vlan_interface' from source: play vars 32935 1726853739.25524: variable 'vlan_interface' from source: play vars 32935 1726853739.25585: variable '__network_service_name_default_initscripts' from source: role '' defaults 32935 1726853739.25650: variable 
'__network_service_name_default_initscripts' from source: role '' defaults 32935 1726853739.25664: variable '__network_packages_default_initscripts' from source: role '' defaults 32935 1726853739.25727: variable '__network_packages_default_initscripts' from source: role '' defaults 32935 1726853739.26067: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 32935 1726853739.26472: variable 'network_connections' from source: task vars 32935 1726853739.26483: variable 'interface' from source: play vars 32935 1726853739.26547: variable 'interface' from source: play vars 32935 1726853739.26562: variable 'vlan_interface' from source: play vars 32935 1726853739.26628: variable 'vlan_interface' from source: play vars 32935 1726853739.26642: variable 'ansible_distribution' from source: facts 32935 1726853739.26649: variable '__network_rh_distros' from source: role '' defaults 32935 1726853739.26660: variable 'ansible_distribution_major_version' from source: facts 32935 1726853739.26683: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 32935 1726853739.26864: variable 'ansible_distribution' from source: facts 32935 1726853739.26876: variable '__network_rh_distros' from source: role '' defaults 32935 1726853739.26885: variable 'ansible_distribution_major_version' from source: facts 32935 1726853739.26901: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 32935 1726853739.27084: variable 'ansible_distribution' from source: facts 32935 1726853739.27092: variable '__network_rh_distros' from source: role '' defaults 32935 1726853739.27103: variable 'ansible_distribution_major_version' from source: facts 32935 1726853739.27141: variable 'network_provider' from source: set_fact 32935 1726853739.27178: variable 'omit' from source: magic vars 32935 1726853739.27211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853739.27242: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853739.27274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853739.27369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853739.27374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853739.27376: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853739.27378: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853739.27380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853739.27455: Set connection var ansible_timeout to 10 32935 1726853739.27469: Set connection var ansible_shell_type to sh 32935 1726853739.27485: Set connection var ansible_pipelining to False 32935 1726853739.27491: Set connection var ansible_connection to ssh 32935 1726853739.27500: Set connection var ansible_shell_executable to /bin/sh 32935 1726853739.27508: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853739.27536: variable 'ansible_shell_executable' from source: unknown 32935 1726853739.27543: variable 'ansible_connection' from source: unknown 32935 1726853739.27549: variable 'ansible_module_compression' from source: unknown 32935 1726853739.27555: 
variable 'ansible_shell_type' from source: unknown 32935 1726853739.27564: variable 'ansible_shell_executable' from source: unknown 32935 1726853739.27577: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853739.27590: variable 'ansible_pipelining' from source: unknown 32935 1726853739.27675: variable 'ansible_timeout' from source: unknown 32935 1726853739.27679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853739.27715: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853739.27730: variable 'omit' from source: magic vars 32935 1726853739.27739: starting attempt loop 32935 1726853739.27745: running the handler 32935 1726853739.27832: variable 'ansible_facts' from source: unknown 32935 1726853739.28636: _low_level_execute_command(): starting 32935 1726853739.28650: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853739.29595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853739.29600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853739.29728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853739.29801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853739.31527: stdout chunk (state=3): >>>/root <<< 32935 1726853739.31607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853739.31745: stderr chunk (state=3): >>><<< 32935 1726853739.31748: stdout chunk (state=3): >>><<< 32935 1726853739.31998: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853739.32002: _low_level_execute_command(): starting 32935 1726853739.32006: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398 `" && echo ansible-tmp-1726853739.3178186-34081-159530588179398="` echo /root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398 `" ) && sleep 0' 32935 1726853739.33019: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853739.33088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853739.33101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853739.33116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853739.33128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853739.33141: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853739.33365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853739.33382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853739.33455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853739.35388: stdout chunk (state=3): >>>ansible-tmp-1726853739.3178186-34081-159530588179398=/root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398 <<< 32935 1726853739.35664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853739.35668: stdout chunk (state=3): >>><<< 32935 1726853739.35711: stderr chunk (state=3): >>><<< 32935 1726853739.35715: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853739.3178186-34081-159530588179398=/root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853739.35727: variable 'ansible_module_compression' from source: unknown 32935 1726853739.35781: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 32935 1726853739.35843: variable 'ansible_facts' from source: unknown 32935 1726853739.36256: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/AnsiballZ_systemd.py 32935 1726853739.36910: Sending initial data 32935 1726853739.36913: Sent initial data (156 bytes) 32935 1726853739.37944: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853739.38143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853739.38147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853739.38153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853739.38156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853739.38185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853739.38311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853739.38384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853739.40013: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 
1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853739.40070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853739.40096: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpuc81z4g_ /root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/AnsiballZ_systemd.py <<< 32935 1726853739.40099: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/AnsiballZ_systemd.py" <<< 32935 1726853739.40137: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpuc81z4g_" to remote "/root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/AnsiballZ_systemd.py" <<< 32935 1726853739.42546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853739.42550: stdout chunk (state=3): >>><<< 32935 1726853739.42553: stderr chunk (state=3): >>><<< 32935 1726853739.42555: done transferring module to remote 32935 1726853739.42557: _low_level_execute_command(): starting 32935 1726853739.42562: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/ /root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/AnsiballZ_systemd.py && sleep 0' 32935 1726853739.43393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853739.43400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853739.43419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853739.43425: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853739.43443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 32935 1726853739.43449: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853739.43463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853739.43466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853739.43633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853739.43663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853739.45508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853739.45587: stderr chunk (state=3): >>><<< 32935 
1726853739.45591: stdout chunk (state=3): >>><<< 32935 1726853739.45652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853739.45655: _low_level_execute_command(): starting 32935 1726853739.45660: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/AnsiballZ_systemd.py && sleep 0' 32935 1726853739.47077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853739.47106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853739.47208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853739.76210: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": 
"yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10780672", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317661696", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1994010000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": 
"infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", 
"WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "system.slice basic.target systemd-journald.socket cloud-init-local.service sysinit.target dbus.socket network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:33:02 EDT", "StateChangeTimestampMonotonic": "748756263", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 32935 1726853739.78196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853739.78199: stdout chunk (state=3): >>><<< 32935 1726853739.78202: stderr chunk (state=3): >>><<< 32935 1726853739.78220: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10780672", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317661696", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1994010000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "system.slice basic.target systemd-journald.socket cloud-init-local.service sysinit.target dbus.socket network-pre.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:33:02 EDT", "StateChangeTimestampMonotonic": "748756263", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853739.78497: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853739.78500: _low_level_execute_command(): starting 32935 1726853739.78503: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853739.3178186-34081-159530588179398/ > /dev/null 2>&1 && sleep 0' 32935 1726853739.79007: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853739.79011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853739.79041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853739.79044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 32935 1726853739.79047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853739.79049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853739.79098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' 
debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853739.79102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853739.79170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853739.81029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853739.81032: stdout chunk (state=3): >>><<< 32935 1726853739.81035: stderr chunk (state=3): >>><<< 32935 1726853739.81055: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853739.81178: handler run complete 32935 1726853739.81182: attempt loop complete, returning result 32935 1726853739.81185: _execute() done 32935 1726853739.81187: dumping result to json 32935 1726853739.81188: done dumping result, returning 32935 1726853739.81190: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-84df-441d-000000000071] 32935 1726853739.81192: sending task result for task 02083763-bbaf-84df-441d-000000000071 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 1726853739.81557: no more pending results, returning what we have 32935 1726853739.81560: results queue empty 32935 1726853739.81561: checking for any_errors_fatal 32935 1726853739.81566: done checking for any_errors_fatal 32935 1726853739.81567: checking for max_fail_percentage 32935 1726853739.81568: done checking for max_fail_percentage 32935 1726853739.81569: checking to see if all hosts have failed and the running result is not ok 32935 1726853739.81572: done checking to see if all hosts have failed 32935 1726853739.81573: getting the remaining hosts for this loop 32935 1726853739.81574: done getting the remaining hosts for this loop 32935 1726853739.81577: getting the next task for host managed_node1 32935 1726853739.81584: done getting next task for host managed_node1 32935 1726853739.81587: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32935 1726853739.81590: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853739.81600: getting variables 32935 1726853739.81602: in VariableManager get_vars() 32935 1726853739.81666: Calling all_inventory to load vars for managed_node1 32935 1726853739.81669: Calling groups_inventory to load vars for managed_node1 32935 1726853739.81694: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853739.81701: done sending task result for task 02083763-bbaf-84df-441d-000000000071 32935 1726853739.81704: WORKER PROCESS EXITING 32935 1726853739.81712: Calling all_plugins_play to load vars for managed_node1 32935 1726853739.81715: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853739.81717: Calling groups_plugins_play to load vars for managed_node1 32935 1726853739.82979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853739.83848: done with get_vars() 32935 1726853739.83866: done getting variables 32935 1726853739.83913: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:35:39 -0400 (0:00:00.734) 0:00:24.975 ****** 32935 1726853739.83936: entering _queue_task() for managed_node1/service 32935 1726853739.84182: worker is 1 (out of 1 available) 32935 1726853739.84196: exiting _queue_task() for managed_node1/service 32935 1726853739.84210: done queuing things up, now waiting for results queue to drain 32935 1726853739.84211: waiting for pending results... 
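[editor's note] The "Enable and start NetworkManager" result above is reported only as {"censored": ..., "changed": false} because the role runs that task with no_log: true. A hypothetical, stripped-down model of that censoring step is sketched below; the real logic lives in ansible-core's result post-processing, and the helper name here is invented for illustration.

    # Hypothetical simplification of the no_log censoring seen in the
    # "Enable and start NetworkManager" result above; not ansible-core's real code.
    CENSORED_MSG = ("the output has been hidden due to the fact that "
                    "'no_log: true' was specified for this result")

    def censor_result(result, no_log):
        """Replace a task result with a censored placeholder when no_log is set."""
        if not no_log:
            return result
        return {"censored": CENSORED_MSG, "changed": result.get("changed", False)}

    raw = {"name": "NetworkManager", "changed": False, "state": "started"}
    print(censor_result(raw, no_log=True))
    # -> {'censored': "the output has been hidden ...", 'changed': False}
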
32935 1726853739.84398: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 32935 1726853739.84491: in run() - task 02083763-bbaf-84df-441d-000000000072 32935 1726853739.84502: variable 'ansible_search_path' from source: unknown 32935 1726853739.84506: variable 'ansible_search_path' from source: unknown 32935 1726853739.84533: calling self._execute() 32935 1726853739.84613: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853739.84617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853739.84627: variable 'omit' from source: magic vars 32935 1726853739.85176: variable 'ansible_distribution_major_version' from source: facts 32935 1726853739.85180: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853739.85182: variable 'network_provider' from source: set_fact 32935 1726853739.85185: Evaluated conditional (network_provider == "nm"): True 32935 1726853739.85233: variable '__network_wpa_supplicant_required' from source: role '' defaults 32935 1726853739.85323: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 32935 1726853739.85514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853739.91390: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853739.91433: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853739.91461: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853739.91487: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853739.91507: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853739.91569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.91590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.91607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.91636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.91647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.91683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.91698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 32935 1726853739.91714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.91740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.91754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.91783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853739.91798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853739.91814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.91839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853739.91854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853739.91945: variable 'network_connections' from source: task vars 32935 1726853739.91954: variable 'interface' from source: play vars 32935 1726853739.92012: variable 'interface' from source: play vars 32935 1726853739.92020: variable 'vlan_interface' from source: play vars 32935 1726853739.92074: variable 'vlan_interface' from source: play vars 32935 1726853739.92119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853739.92226: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853739.92252: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853739.92276: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853739.92299: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853739.92327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 32935 1726853739.92342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 32935 1726853739.92358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853739.92379: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 32935 1726853739.92411: variable '__network_wireless_connections_defined' from source: role '' defaults 32935 1726853739.92568: variable 'network_connections' from source: task vars 32935 1726853739.92573: variable 'interface' from source: play vars 32935 1726853739.92616: variable 'interface' from source: play vars 32935 1726853739.92623: variable 'vlan_interface' from source: play vars 32935 1726853739.92668: variable 'vlan_interface' from source: play vars 32935 1726853739.92691: Evaluated conditional (__network_wpa_supplicant_required): False 32935 1726853739.92694: when evaluation is False, skipping this task 32935 1726853739.92704: _execute() done 32935 1726853739.92706: dumping result to json 32935 1726853739.92709: done dumping result, returning 32935 1726853739.92711: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-84df-441d-000000000072] 32935 1726853739.92713: sending task result for task 02083763-bbaf-84df-441d-000000000072 32935 1726853739.92794: done sending task result for task 02083763-bbaf-84df-441d-000000000072 32935 1726853739.92797: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 32935 1726853739.92866: no more pending results, returning what we have 32935 1726853739.92869: results queue empty 32935 1726853739.92870: checking for any_errors_fatal 32935 1726853739.92885: done checking for any_errors_fatal 32935 1726853739.92885: checking for max_fail_percentage 32935 1726853739.92887: done checking for max_fail_percentage 32935 1726853739.92888: checking to see if all hosts have failed and the running result is not ok 32935 1726853739.92889: done checking to see if all hosts have failed 32935 1726853739.92889: getting the remaining hosts for this loop 32935 1726853739.92891: done getting the remaining hosts for this loop 32935 1726853739.92894: getting the next task for host managed_node1 32935 1726853739.92900: done getting next task for host managed_node1 32935 1726853739.92903: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 32935 1726853739.92906: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853739.92922: getting variables 32935 1726853739.92923: in VariableManager get_vars() 32935 1726853739.92963: Calling all_inventory to load vars for managed_node1 32935 1726853739.92966: Calling groups_inventory to load vars for managed_node1 32935 1726853739.92968: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853739.92978: Calling all_plugins_play to load vars for managed_node1 32935 1726853739.92981: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853739.92984: Calling groups_plugins_play to load vars for managed_node1 32935 1726853739.97227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853739.98070: done with get_vars() 32935 1726853739.98089: done getting variables 32935 1726853739.98124: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:35:39 -0400 (0:00:00.142) 0:00:25.117 ****** 32935 1726853739.98143: entering _queue_task() for managed_node1/service 32935 1726853739.98409: worker is 1 (out of 1 available) 32935 1726853739.98422: exiting _queue_task() for managed_node1/service 32935 1726853739.98434: done queuing things up, now waiting for results queue to drain 32935 1726853739.98436: waiting for pending results... 32935 1726853739.98625: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 32935 1726853739.98723: in run() - task 02083763-bbaf-84df-441d-000000000073 32935 1726853739.98733: variable 'ansible_search_path' from source: unknown 32935 1726853739.98736: variable 'ansible_search_path' from source: unknown 32935 1726853739.98769: calling self._execute() 32935 1726853739.98849: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853739.98853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853739.98864: variable 'omit' from source: magic vars 32935 1726853739.99146: variable 'ansible_distribution_major_version' from source: facts 32935 1726853739.99156: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853739.99235: variable 'network_provider' from source: set_fact 32935 1726853739.99238: Evaluated conditional (network_provider == "initscripts"): False 32935 1726853739.99241: when evaluation is False, skipping this task 32935 1726853739.99244: _execute() done 32935 1726853739.99248: dumping result to json 32935 1726853739.99252: done dumping result, returning 32935 1726853739.99258: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-84df-441d-000000000073] 32935 1726853739.99265: sending task result for task 02083763-bbaf-84df-441d-000000000073 32935 1726853739.99357: done sending task result for task 02083763-bbaf-84df-441d-000000000073 32935 1726853739.99359: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 32935 
1726853739.99401: no more pending results, returning what we have 32935 1726853739.99404: results queue empty 32935 1726853739.99405: checking for any_errors_fatal 32935 1726853739.99415: done checking for any_errors_fatal 32935 1726853739.99415: checking for max_fail_percentage 32935 1726853739.99417: done checking for max_fail_percentage 32935 1726853739.99418: checking to see if all hosts have failed and the running result is not ok 32935 1726853739.99419: done checking to see if all hosts have failed 32935 1726853739.99419: getting the remaining hosts for this loop 32935 1726853739.99421: done getting the remaining hosts for this loop 32935 1726853739.99424: getting the next task for host managed_node1 32935 1726853739.99432: done getting next task for host managed_node1 32935 1726853739.99436: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32935 1726853739.99439: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853739.99459: getting variables 32935 1726853739.99461: in VariableManager get_vars() 32935 1726853739.99500: Calling all_inventory to load vars for managed_node1 32935 1726853739.99502: Calling groups_inventory to load vars for managed_node1 32935 1726853739.99505: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853739.99514: Calling all_plugins_play to load vars for managed_node1 32935 1726853739.99516: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853739.99519: Calling groups_plugins_play to load vars for managed_node1 32935 1726853740.00273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853740.01141: done with get_vars() 32935 1726853740.01156: done getting variables 32935 1726853740.01199: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:35:40 -0400 (0:00:00.030) 0:00:25.147 ****** 32935 1726853740.01223: entering _queue_task() for managed_node1/copy 32935 1726853740.01445: worker is 1 (out of 1 available) 32935 1726853740.01459: exiting _queue_task() for managed_node1/copy 32935 1726853740.01474: done queuing things up, now waiting for results queue to drain 32935 1726853740.01476: waiting for pending results... 
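[editor's note] The "Enable and start wpa_supplicant" and "Enable network service" tasks above are skipped because one of their `when` conditions evaluates to False, and that condition is echoed back as "false_condition" in the skip result. The sketch below is a rough, hypothetical model of that decision only: real Ansible renders each expression through Jinja2 against the host's variables, and the variable values used here are placeholders, since the log shows only how the conditionals evaluated, not the underlying facts.

    def evaluate_when(conditions, variables):
        """Return (run_task, false_condition) for a list of `when` expressions."""
        for cond in conditions:
            # Real Ansible templates the expression with Jinja2; eval() keeps
            # this illustration self-contained.
            if not eval(cond, {}, dict(variables)):
                return False, cond
        return True, None

    # Placeholder values for illustration only.
    variables = {"ansible_distribution_major_version": "40", "network_provider": "nm"}

    print(evaluate_when(
        ["ansible_distribution_major_version != '6'", "network_provider == 'initscripts'"],
        variables,
    ))
    # -> (False, "network_provider == 'initscripts'")
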
32935 1726853740.01660: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 32935 1726853740.01750: in run() - task 02083763-bbaf-84df-441d-000000000074 32935 1726853740.01761: variable 'ansible_search_path' from source: unknown 32935 1726853740.01767: variable 'ansible_search_path' from source: unknown 32935 1726853740.01801: calling self._execute() 32935 1726853740.01881: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.01885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.01893: variable 'omit' from source: magic vars 32935 1726853740.02170: variable 'ansible_distribution_major_version' from source: facts 32935 1726853740.02181: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853740.02263: variable 'network_provider' from source: set_fact 32935 1726853740.02267: Evaluated conditional (network_provider == "initscripts"): False 32935 1726853740.02270: when evaluation is False, skipping this task 32935 1726853740.02274: _execute() done 32935 1726853740.02277: dumping result to json 32935 1726853740.02279: done dumping result, returning 32935 1726853740.02286: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-84df-441d-000000000074] 32935 1726853740.02290: sending task result for task 02083763-bbaf-84df-441d-000000000074 32935 1726853740.02380: done sending task result for task 02083763-bbaf-84df-441d-000000000074 32935 1726853740.02383: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 32935 1726853740.02428: no more pending results, returning what we have 32935 1726853740.02431: results queue empty 32935 1726853740.02432: checking for any_errors_fatal 32935 1726853740.02436: done checking for any_errors_fatal 32935 1726853740.02437: checking for max_fail_percentage 32935 1726853740.02439: done checking for max_fail_percentage 32935 1726853740.02440: checking to see if all hosts have failed and the running result is not ok 32935 1726853740.02441: done checking to see if all hosts have failed 32935 1726853740.02442: getting the remaining hosts for this loop 32935 1726853740.02443: done getting the remaining hosts for this loop 32935 1726853740.02446: getting the next task for host managed_node1 32935 1726853740.02453: done getting next task for host managed_node1 32935 1726853740.02457: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32935 1726853740.02462: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853740.02480: getting variables 32935 1726853740.02482: in VariableManager get_vars() 32935 1726853740.02515: Calling all_inventory to load vars for managed_node1 32935 1726853740.02517: Calling groups_inventory to load vars for managed_node1 32935 1726853740.02519: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853740.02527: Calling all_plugins_play to load vars for managed_node1 32935 1726853740.02529: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853740.02532: Calling groups_plugins_play to load vars for managed_node1 32935 1726853740.03451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853740.04715: done with get_vars() 32935 1726853740.04732: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:35:40 -0400 (0:00:00.035) 0:00:25.183 ****** 32935 1726853740.04795: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 32935 1726853740.05027: worker is 1 (out of 1 available) 32935 1726853740.05042: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 32935 1726853740.05054: done queuing things up, now waiting for results queue to drain 32935 1726853740.05056: waiting for pending results... 32935 1726853740.05235: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 32935 1726853740.05335: in run() - task 02083763-bbaf-84df-441d-000000000075 32935 1726853740.05345: variable 'ansible_search_path' from source: unknown 32935 1726853740.05349: variable 'ansible_search_path' from source: unknown 32935 1726853740.05379: calling self._execute() 32935 1726853740.05453: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.05457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.05467: variable 'omit' from source: magic vars 32935 1726853740.05745: variable 'ansible_distribution_major_version' from source: facts 32935 1726853740.05755: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853740.05763: variable 'omit' from source: magic vars 32935 1726853740.05799: variable 'omit' from source: magic vars 32935 1726853740.05910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853740.08077: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853740.08081: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853740.08083: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853740.08088: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853740.08120: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853740.08212: variable 'network_provider' from source: set_fact 32935 1726853740.08345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 32935 1726853740.08383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853740.08417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853740.08462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853740.08486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853740.08558: variable 'omit' from source: magic vars 32935 1726853740.08674: variable 'omit' from source: magic vars 32935 1726853740.08775: variable 'network_connections' from source: task vars 32935 1726853740.08792: variable 'interface' from source: play vars 32935 1726853740.08853: variable 'interface' from source: play vars 32935 1726853740.08867: variable 'vlan_interface' from source: play vars 32935 1726853740.08930: variable 'vlan_interface' from source: play vars 32935 1726853740.09082: variable 'omit' from source: magic vars 32935 1726853740.09095: variable '__lsr_ansible_managed' from source: task vars 32935 1726853740.09155: variable '__lsr_ansible_managed' from source: task vars 32935 1726853740.09423: Loaded config def from plugin (lookup/template) 32935 1726853740.09433: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 32935 1726853740.09466: File lookup term: get_ansible_managed.j2 32935 1726853740.09476: variable 'ansible_search_path' from source: unknown 32935 1726853740.09484: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 32935 1726853740.09499: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 32935 1726853740.09520: variable 'ansible_search_path' from source: unknown 32935 1726853740.15692: variable 'ansible_managed' from source: unknown 32935 1726853740.15876: variable 'omit' from source: magic vars 32935 1726853740.15879: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853740.15886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853740.15902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853740.15924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853740.15938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853740.15969: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853740.15980: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.15988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.16082: Set connection var ansible_timeout to 10 32935 1726853740.16094: Set connection var ansible_shell_type to sh 32935 1726853740.16105: Set connection var ansible_pipelining to False 32935 1726853740.16111: Set connection var ansible_connection to ssh 32935 1726853740.16120: Set connection var ansible_shell_executable to /bin/sh 32935 1726853740.16128: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853740.16155: variable 'ansible_shell_executable' from source: unknown 32935 1726853740.16162: variable 'ansible_connection' from source: unknown 32935 1726853740.16168: variable 'ansible_module_compression' from source: unknown 32935 1726853740.16177: variable 'ansible_shell_type' from source: unknown 32935 1726853740.16183: variable 'ansible_shell_executable' from source: unknown 32935 1726853740.16189: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.16196: variable 'ansible_pipelining' from source: unknown 32935 1726853740.16202: variable 'ansible_timeout' from source: unknown 32935 1726853740.16210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.16512: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853740.16523: variable 'omit' from source: magic vars 32935 1726853740.16526: starting attempt loop 32935 1726853740.16529: running the handler 32935 1726853740.16534: _low_level_execute_command(): starting 32935 1726853740.16536: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853740.17199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.17240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853740.17261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853740.17295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853740.17391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853740.19538: stdout chunk (state=3): >>>/root <<< 32935 1726853740.19544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853740.19548: stdout chunk (state=3): >>><<< 32935 1726853740.19551: stderr chunk (state=3): >>><<< 32935 1726853740.19555: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853740.19561: _low_level_execute_command(): starting 32935 1726853740.19566: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632 `" && echo ansible-tmp-1726853740.1943328-34121-278997064051632="` echo /root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632 `" ) && sleep 0' 32935 1726853740.20392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.20481: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853740.20521: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853740.20738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853740.20816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853740.22720: stdout chunk (state=3): >>>ansible-tmp-1726853740.1943328-34121-278997064051632=/root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632 <<< 32935 1726853740.22918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853740.22967: stderr chunk (state=3): >>><<< 32935 1726853740.22974: stdout chunk (state=3): >>><<< 32935 1726853740.22995: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853740.1943328-34121-278997064051632=/root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853740.23238: variable 'ansible_module_compression' from source: unknown 32935 1726853740.23243: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 32935 1726853740.23269: variable 'ansible_facts' from source: unknown 32935 1726853740.23466: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/AnsiballZ_network_connections.py 32935 1726853740.23580: Sending initial data 32935 1726853740.23686: Sent initial data (168 bytes) 32935 1726853740.24249: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853740.24268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853740.24332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.24403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853740.24433: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853740.24455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853740.24539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853740.26070: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853740.26129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853740.26185: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp39f79lxl /root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/AnsiballZ_network_connections.py <<< 32935 1726853740.26195: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/AnsiballZ_network_connections.py" <<< 32935 1726853740.26247: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp39f79lxl" to remote "/root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/AnsiballZ_network_connections.py" <<< 32935 1726853740.27286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853740.27290: stdout chunk (state=3): >>><<< 32935 1726853740.27292: stderr chunk (state=3): >>><<< 32935 1726853740.27318: done transferring module to remote 32935 1726853740.27395: _low_level_execute_command(): starting 32935 1726853740.27398: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/ /root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/AnsiballZ_network_connections.py && sleep 0' 32935 1726853740.27978: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853740.27993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853740.28006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.28022: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853740.28052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.28142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853740.28169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853740.28237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853740.30021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853740.30045: stdout chunk (state=3): >>><<< 32935 1726853740.30047: stderr chunk (state=3): >>><<< 32935 1726853740.30063: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853740.30070: _low_level_execute_command(): starting 32935 1726853740.30151: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/AnsiballZ_network_connections.py && sleep 0' 32935 1726853740.30677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853740.30692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853740.30705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.30727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853740.30792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.30844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853740.30861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853740.30883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853740.30959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853740.69086: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_11_rt1bd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_11_rt1bd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/e9b344ac-7aa9-4d34-9c01-f1b4dd46183f: error=unknown <<< 32935 1726853740.70744: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_11_rt1bd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_11_rt1bd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/84991582-42ea-41a9-ba62-c7b3edc4be1a: error=unknown <<< 32935 1726853740.70951: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 32935 1726853740.72841: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853740.72867: stderr chunk (state=3): >>><<< 32935 1726853740.72872: stdout chunk (state=3): >>><<< 32935 1726853740.72888: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_11_rt1bd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_11_rt1bd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101/e9b344ac-7aa9-4d34-9c01-f1b4dd46183f: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_11_rt1bd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_11_rt1bd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr101.90/84991582-42ea-41a9-ba62-c7b3edc4be1a: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr101", "persistent_state": "absent", "state": "down"}, {"name": "lsr101.90", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853740.72918: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr101', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'lsr101.90', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853740.72925: _low_level_execute_command(): starting 32935 1726853740.72930: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853740.1943328-34121-278997064051632/ > /dev/null 2>&1 && sleep 0' 32935 1726853740.73337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.73365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853740.73368: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853740.73370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.73375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.73378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.73432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853740.73444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853740.73445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853740.73481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853740.75354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853740.75380: stderr chunk (state=3): >>><<< 32935 1726853740.75383: stdout chunk (state=3): >>><<< 32935 1726853740.75396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853740.75401: handler run complete 32935 1726853740.75421: attempt loop complete, returning result 32935 1726853740.75424: _execute() done 32935 1726853740.75426: dumping result to json 32935 1726853740.75431: done dumping result, returning 32935 1726853740.75440: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-84df-441d-000000000075] 32935 1726853740.75442: sending task result for task 02083763-bbaf-84df-441d-000000000075 32935 1726853740.75543: done sending task result for task 02083763-bbaf-84df-441d-000000000075 32935 1726853740.75546: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 32935 1726853740.75669: no more pending results, returning what we have 32935 1726853740.75674: results queue empty 32935 1726853740.75675: checking for any_errors_fatal 32935 1726853740.75681: done checking for any_errors_fatal 32935 1726853740.75682: checking for max_fail_percentage 32935 1726853740.75684: done checking for max_fail_percentage 32935 1726853740.75684: checking to see if all hosts have failed and the running result is not ok 32935 1726853740.75685: done checking to see if all hosts have failed 32935 1726853740.75686: getting the remaining hosts for this loop 32935 1726853740.75688: done getting the remaining hosts for this loop 32935 1726853740.75691: getting the next task for host managed_node1 32935 1726853740.75697: done getting next task for host managed_node1 32935 1726853740.75701: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 32935 1726853740.75703: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853740.75713: getting variables 32935 1726853740.75714: in VariableManager get_vars() 32935 1726853740.75751: Calling all_inventory to load vars for managed_node1 32935 1726853740.75753: Calling groups_inventory to load vars for managed_node1 32935 1726853740.75755: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853740.75767: Calling all_plugins_play to load vars for managed_node1 32935 1726853740.75769: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853740.75779: Calling groups_plugins_play to load vars for managed_node1 32935 1726853740.76589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853740.77574: done with get_vars() 32935 1726853740.77590: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:35:40 -0400 (0:00:00.728) 0:00:25.912 ****** 32935 1726853740.77651: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 32935 1726853740.77892: worker is 1 (out of 1 available) 32935 1726853740.77908: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 32935 1726853740.77919: done queuing things up, now waiting for results queue to drain 32935 1726853740.77921: waiting for pending results... 32935 1726853740.78100: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 32935 1726853740.78199: in run() - task 02083763-bbaf-84df-441d-000000000076 32935 1726853740.78210: variable 'ansible_search_path' from source: unknown 32935 1726853740.78214: variable 'ansible_search_path' from source: unknown 32935 1726853740.78244: calling self._execute() 32935 1726853740.78322: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.78326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.78335: variable 'omit' from source: magic vars 32935 1726853740.78610: variable 'ansible_distribution_major_version' from source: facts 32935 1726853740.78619: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853740.78704: variable 'network_state' from source: role '' defaults 32935 1726853740.78711: Evaluated conditional (network_state != {}): False 32935 1726853740.78713: when evaluation is False, skipping this task 32935 1726853740.78716: _execute() done 32935 1726853740.78719: dumping result to json 32935 1726853740.78722: done dumping result, returning 32935 1726853740.78730: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-84df-441d-000000000076] 32935 1726853740.78735: sending task result for task 02083763-bbaf-84df-441d-000000000076 32935 1726853740.78819: done sending task result for task 02083763-bbaf-84df-441d-000000000076 32935 1726853740.78821: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 32935 1726853740.78872: no more pending results, returning what we have 32935 1726853740.78877: results queue empty 32935 
1726853740.78878: checking for any_errors_fatal 32935 1726853740.78888: done checking for any_errors_fatal 32935 1726853740.78889: checking for max_fail_percentage 32935 1726853740.78891: done checking for max_fail_percentage 32935 1726853740.78892: checking to see if all hosts have failed and the running result is not ok 32935 1726853740.78893: done checking to see if all hosts have failed 32935 1726853740.78894: getting the remaining hosts for this loop 32935 1726853740.78895: done getting the remaining hosts for this loop 32935 1726853740.78898: getting the next task for host managed_node1 32935 1726853740.78906: done getting next task for host managed_node1 32935 1726853740.78909: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32935 1726853740.78912: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853740.78928: getting variables 32935 1726853740.78930: in VariableManager get_vars() 32935 1726853740.78965: Calling all_inventory to load vars for managed_node1 32935 1726853740.78967: Calling groups_inventory to load vars for managed_node1 32935 1726853740.78969: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853740.78990: Calling all_plugins_play to load vars for managed_node1 32935 1726853740.78993: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853740.78996: Calling groups_plugins_play to load vars for managed_node1 32935 1726853740.79746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853740.80616: done with get_vars() 32935 1726853740.80630: done getting variables 32935 1726853740.80675: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:35:40 -0400 (0:00:00.030) 0:00:25.942 ****** 32935 1726853740.80698: entering _queue_task() for managed_node1/debug 32935 1726853740.80904: worker is 1 (out of 1 available) 32935 1726853740.80919: exiting _queue_task() for managed_node1/debug 32935 1726853740.80930: done queuing things up, now waiting for results queue to drain 32935 1726853740.80932: waiting for pending results... 
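The "Configure networking connection profiles" task traced above ran the fedora.linux_system_roles.network_connections module with provider nm and two profiles, lsr101 and lsr101.90, both requested as persistent_state: absent and state: down. Below is a minimal sketch of playbook variables that would drive the role to make that module call; the play wiring is assumed, while the host, provider, connection names, and states are taken from the module_args recorded in the trace:

    # Assumed play structure; values copied from the recorded module_args
    - hosts: managed_node1
      vars:
        network_provider: nm
        network_connections:
          - name: lsr101
            persistent_state: absent
            state: down
          - name: lsr101.90
            persistent_state: absent
            state: down
      roles:
        - fedora.linux_system_roles.network
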
32935 1726853740.81097: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 32935 1726853740.81184: in run() - task 02083763-bbaf-84df-441d-000000000077 32935 1726853740.81195: variable 'ansible_search_path' from source: unknown 32935 1726853740.81198: variable 'ansible_search_path' from source: unknown 32935 1726853740.81225: calling self._execute() 32935 1726853740.81304: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.81308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.81317: variable 'omit' from source: magic vars 32935 1726853740.81583: variable 'ansible_distribution_major_version' from source: facts 32935 1726853740.81597: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853740.81600: variable 'omit' from source: magic vars 32935 1726853740.81638: variable 'omit' from source: magic vars 32935 1726853740.81664: variable 'omit' from source: magic vars 32935 1726853740.81698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853740.81727: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853740.81744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853740.81756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853740.81767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853740.81791: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853740.81794: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.81797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.81868: Set connection var ansible_timeout to 10 32935 1726853740.81874: Set connection var ansible_shell_type to sh 32935 1726853740.81881: Set connection var ansible_pipelining to False 32935 1726853740.81883: Set connection var ansible_connection to ssh 32935 1726853740.81888: Set connection var ansible_shell_executable to /bin/sh 32935 1726853740.81894: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853740.81911: variable 'ansible_shell_executable' from source: unknown 32935 1726853740.81914: variable 'ansible_connection' from source: unknown 32935 1726853740.81918: variable 'ansible_module_compression' from source: unknown 32935 1726853740.81921: variable 'ansible_shell_type' from source: unknown 32935 1726853740.81924: variable 'ansible_shell_executable' from source: unknown 32935 1726853740.81928: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.81930: variable 'ansible_pipelining' from source: unknown 32935 1726853740.81932: variable 'ansible_timeout' from source: unknown 32935 1726853740.81934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.82032: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 
1726853740.82041: variable 'omit' from source: magic vars 32935 1726853740.82046: starting attempt loop 32935 1726853740.82049: running the handler 32935 1726853740.82139: variable '__network_connections_result' from source: set_fact 32935 1726853740.82181: handler run complete 32935 1726853740.82195: attempt loop complete, returning result 32935 1726853740.82198: _execute() done 32935 1726853740.82201: dumping result to json 32935 1726853740.82203: done dumping result, returning 32935 1726853740.82211: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-84df-441d-000000000077] 32935 1726853740.82214: sending task result for task 02083763-bbaf-84df-441d-000000000077 32935 1726853740.82298: done sending task result for task 02083763-bbaf-84df-441d-000000000077 32935 1726853740.82300: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 32935 1726853740.82362: no more pending results, returning what we have 32935 1726853740.82366: results queue empty 32935 1726853740.82367: checking for any_errors_fatal 32935 1726853740.82376: done checking for any_errors_fatal 32935 1726853740.82377: checking for max_fail_percentage 32935 1726853740.82378: done checking for max_fail_percentage 32935 1726853740.82379: checking to see if all hosts have failed and the running result is not ok 32935 1726853740.82380: done checking to see if all hosts have failed 32935 1726853740.82381: getting the remaining hosts for this loop 32935 1726853740.82383: done getting the remaining hosts for this loop 32935 1726853740.82386: getting the next task for host managed_node1 32935 1726853740.82393: done getting next task for host managed_node1 32935 1726853740.82396: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32935 1726853740.82399: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853740.82409: getting variables 32935 1726853740.82410: in VariableManager get_vars() 32935 1726853740.82442: Calling all_inventory to load vars for managed_node1 32935 1726853740.82444: Calling groups_inventory to load vars for managed_node1 32935 1726853740.82446: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853740.82453: Calling all_plugins_play to load vars for managed_node1 32935 1726853740.82456: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853740.82461: Calling groups_plugins_play to load vars for managed_node1 32935 1726853740.83370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853740.84218: done with get_vars() 32935 1726853740.84232: done getting variables 32935 1726853740.84276: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:35:40 -0400 (0:00:00.035) 0:00:25.978 ****** 32935 1726853740.84298: entering _queue_task() for managed_node1/debug 32935 1726853740.84517: worker is 1 (out of 1 available) 32935 1726853740.84531: exiting _queue_task() for managed_node1/debug 32935 1726853740.84542: done queuing things up, now waiting for results queue to drain 32935 1726853740.84544: waiting for pending results... 32935 1726853740.84717: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 32935 1726853740.84805: in run() - task 02083763-bbaf-84df-441d-000000000078 32935 1726853740.84817: variable 'ansible_search_path' from source: unknown 32935 1726853740.84820: variable 'ansible_search_path' from source: unknown 32935 1726853740.84847: calling self._execute() 32935 1726853740.84925: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.84929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.84937: variable 'omit' from source: magic vars 32935 1726853740.85213: variable 'ansible_distribution_major_version' from source: facts 32935 1726853740.85221: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853740.85224: variable 'omit' from source: magic vars 32935 1726853740.85263: variable 'omit' from source: magic vars 32935 1726853740.85287: variable 'omit' from source: magic vars 32935 1726853740.85320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853740.85349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853740.85366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853740.85381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853740.85390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853740.85413: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853740.85415: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.85418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.85490: Set connection var ansible_timeout to 10 32935 1726853740.85494: Set connection var ansible_shell_type to sh 32935 1726853740.85501: Set connection var ansible_pipelining to False 32935 1726853740.85504: Set connection var ansible_connection to ssh 32935 1726853740.85509: Set connection var ansible_shell_executable to /bin/sh 32935 1726853740.85514: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853740.85533: variable 'ansible_shell_executable' from source: unknown 32935 1726853740.85537: variable 'ansible_connection' from source: unknown 32935 1726853740.85542: variable 'ansible_module_compression' from source: unknown 32935 1726853740.85545: variable 'ansible_shell_type' from source: unknown 32935 1726853740.85547: variable 'ansible_shell_executable' from source: unknown 32935 1726853740.85550: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.85552: variable 'ansible_pipelining' from source: unknown 32935 1726853740.85554: variable 'ansible_timeout' from source: unknown 32935 1726853740.85556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.85652: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853740.85662: variable 'omit' from source: magic vars 32935 1726853740.85665: starting attempt loop 32935 1726853740.85679: running the handler 32935 1726853740.85709: variable '__network_connections_result' from source: set_fact 32935 1726853740.85763: variable '__network_connections_result' from source: set_fact 32935 1726853740.85841: handler run complete 32935 1726853740.85857: attempt loop complete, returning result 32935 1726853740.85863: _execute() done 32935 1726853740.85865: dumping result to json 32935 1726853740.85868: done dumping result, returning 32935 1726853740.85876: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-84df-441d-000000000078] 32935 1726853740.85882: sending task result for task 02083763-bbaf-84df-441d-000000000078 32935 1726853740.85965: done sending task result for task 02083763-bbaf-84df-441d-000000000078 32935 1726853740.85968: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr101", "persistent_state": "absent", "state": "down" }, { "name": "lsr101.90", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 32935 1726853740.86070: no more pending results, returning what we have 32935 1726853740.86075: results queue empty 32935 1726853740.86076: checking for any_errors_fatal 32935 1726853740.86080: done checking for any_errors_fatal 32935 1726853740.86081: checking for max_fail_percentage 32935 
1726853740.86082: done checking for max_fail_percentage 32935 1726853740.86083: checking to see if all hosts have failed and the running result is not ok 32935 1726853740.86084: done checking to see if all hosts have failed 32935 1726853740.86084: getting the remaining hosts for this loop 32935 1726853740.86086: done getting the remaining hosts for this loop 32935 1726853740.86088: getting the next task for host managed_node1 32935 1726853740.86094: done getting next task for host managed_node1 32935 1726853740.86097: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32935 1726853740.86101: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853740.86111: getting variables 32935 1726853740.86112: in VariableManager get_vars() 32935 1726853740.86143: Calling all_inventory to load vars for managed_node1 32935 1726853740.86146: Calling groups_inventory to load vars for managed_node1 32935 1726853740.86147: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853740.86155: Calling all_plugins_play to load vars for managed_node1 32935 1726853740.86160: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853740.86163: Calling groups_plugins_play to load vars for managed_node1 32935 1726853740.86898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853740.87764: done with get_vars() 32935 1726853740.87781: done getting variables 32935 1726853740.87822: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:35:40 -0400 (0:00:00.035) 0:00:26.014 ****** 32935 1726853740.87848: entering _queue_task() for managed_node1/debug 32935 1726853740.88065: worker is 1 (out of 1 available) 32935 1726853740.88082: exiting _queue_task() for managed_node1/debug 32935 1726853740.88093: done queuing things up, now waiting for results queue to drain 32935 1726853740.88095: waiting for pending results... 
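For reference, the two debug tasks dispatched in this stretch of the log come from roles/network/tasks/main.yml (lines 181 and 186). The sketch below is a reconstruction from the task names, the variable printed above, and the skip condition reported just below; it is not a verbatim copy of the role source, and the variable shown for the network_state case is an assumption.

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result

    - name: Show debug messages for the network_state
      debug:
        var: __network_state          # assumed variable name
      when: network_state != {}       # matches the false_condition logged below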
32935 1726853740.88285: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 32935 1726853740.88385: in run() - task 02083763-bbaf-84df-441d-000000000079 32935 1726853740.88396: variable 'ansible_search_path' from source: unknown 32935 1726853740.88400: variable 'ansible_search_path' from source: unknown 32935 1726853740.88430: calling self._execute() 32935 1726853740.88515: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.88519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.88527: variable 'omit' from source: magic vars 32935 1726853740.88808: variable 'ansible_distribution_major_version' from source: facts 32935 1726853740.88817: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853740.88904: variable 'network_state' from source: role '' defaults 32935 1726853740.88912: Evaluated conditional (network_state != {}): False 32935 1726853740.88915: when evaluation is False, skipping this task 32935 1726853740.88917: _execute() done 32935 1726853740.88920: dumping result to json 32935 1726853740.88924: done dumping result, returning 32935 1726853740.88932: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-84df-441d-000000000079] 32935 1726853740.88937: sending task result for task 02083763-bbaf-84df-441d-000000000079 32935 1726853740.89021: done sending task result for task 02083763-bbaf-84df-441d-000000000079 32935 1726853740.89024: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 32935 1726853740.89066: no more pending results, returning what we have 32935 1726853740.89069: results queue empty 32935 1726853740.89070: checking for any_errors_fatal 32935 1726853740.89078: done checking for any_errors_fatal 32935 1726853740.89079: checking for max_fail_percentage 32935 1726853740.89081: done checking for max_fail_percentage 32935 1726853740.89082: checking to see if all hosts have failed and the running result is not ok 32935 1726853740.89083: done checking to see if all hosts have failed 32935 1726853740.89084: getting the remaining hosts for this loop 32935 1726853740.89085: done getting the remaining hosts for this loop 32935 1726853740.89089: getting the next task for host managed_node1 32935 1726853740.89096: done getting next task for host managed_node1 32935 1726853740.89101: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 32935 1726853740.89104: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853740.89120: getting variables 32935 1726853740.89122: in VariableManager get_vars() 32935 1726853740.89156: Calling all_inventory to load vars for managed_node1 32935 1726853740.89158: Calling groups_inventory to load vars for managed_node1 32935 1726853740.89160: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853740.89169: Calling all_plugins_play to load vars for managed_node1 32935 1726853740.89173: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853740.89176: Calling groups_plugins_play to load vars for managed_node1 32935 1726853740.90040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853740.90899: done with get_vars() 32935 1726853740.90913: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:35:40 -0400 (0:00:00.031) 0:00:26.045 ****** 32935 1726853740.90982: entering _queue_task() for managed_node1/ping 32935 1726853740.91215: worker is 1 (out of 1 available) 32935 1726853740.91228: exiting _queue_task() for managed_node1/ping 32935 1726853740.91240: done queuing things up, now waiting for results queue to drain 32935 1726853740.91242: waiting for pending results... 32935 1726853740.91422: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 32935 1726853740.91516: in run() - task 02083763-bbaf-84df-441d-00000000007a 32935 1726853740.91527: variable 'ansible_search_path' from source: unknown 32935 1726853740.91530: variable 'ansible_search_path' from source: unknown 32935 1726853740.91558: calling self._execute() 32935 1726853740.91638: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.91642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.91650: variable 'omit' from source: magic vars 32935 1726853740.91933: variable 'ansible_distribution_major_version' from source: facts 32935 1726853740.91943: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853740.91948: variable 'omit' from source: magic vars 32935 1726853740.91987: variable 'omit' from source: magic vars 32935 1726853740.92016: variable 'omit' from source: magic vars 32935 1726853740.92043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853740.92075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853740.92091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853740.92106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853740.92122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853740.92141: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853740.92144: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.92146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.92216: Set connection var ansible_timeout to 10 32935 1726853740.92219: Set connection var 
ansible_shell_type to sh 32935 1726853740.92230: Set connection var ansible_pipelining to False 32935 1726853740.92235: Set connection var ansible_connection to ssh 32935 1726853740.92238: Set connection var ansible_shell_executable to /bin/sh 32935 1726853740.92240: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853740.92264: variable 'ansible_shell_executable' from source: unknown 32935 1726853740.92267: variable 'ansible_connection' from source: unknown 32935 1726853740.92270: variable 'ansible_module_compression' from source: unknown 32935 1726853740.92274: variable 'ansible_shell_type' from source: unknown 32935 1726853740.92276: variable 'ansible_shell_executable' from source: unknown 32935 1726853740.92278: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853740.92281: variable 'ansible_pipelining' from source: unknown 32935 1726853740.92283: variable 'ansible_timeout' from source: unknown 32935 1726853740.92286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853740.92432: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 32935 1726853740.92442: variable 'omit' from source: magic vars 32935 1726853740.92445: starting attempt loop 32935 1726853740.92448: running the handler 32935 1726853740.92464: _low_level_execute_command(): starting 32935 1726853740.92472: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853740.92981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.92985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 32935 1726853740.92988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.92990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.93044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853740.93047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853740.93049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853740.93102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853740.94782: stdout chunk (state=3): >>>/root <<< 32935 1726853740.94887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853740.94913: stderr chunk (state=3): >>><<< 32935 1726853740.94916: stdout chunk (state=3): >>><<< 32935 1726853740.94939: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853740.94951: _low_level_execute_command(): starting 32935 1726853740.94956: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938 `" && echo ansible-tmp-1726853740.9493797-34155-278777915751938="` echo /root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938 `" ) && sleep 0' 32935 1726853740.95386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.95389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.95399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.95401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.95448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853740.95454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853740.95456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853740.95494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853740.97402: stdout chunk (state=3): >>>ansible-tmp-1726853740.9493797-34155-278777915751938=/root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938 <<< 32935 1726853740.97504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853740.97529: stderr chunk (state=3): >>><<< 32935 
1726853740.97532: stdout chunk (state=3): >>><<< 32935 1726853740.97548: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853740.9493797-34155-278777915751938=/root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853740.97591: variable 'ansible_module_compression' from source: unknown 32935 1726853740.97622: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 32935 1726853740.97650: variable 'ansible_facts' from source: unknown 32935 1726853740.97707: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/AnsiballZ_ping.py 32935 1726853740.97809: Sending initial data 32935 1726853740.97812: Sent initial data (153 bytes) 32935 1726853740.98243: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.98246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.98249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853740.98251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853740.98308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853740.98318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853740.98326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853740.98350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 
1726853740.99911: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32935 1726853740.99918: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853740.99948: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853740.99987: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpv0l_bcex /root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/AnsiballZ_ping.py <<< 32935 1726853740.99990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/AnsiballZ_ping.py" <<< 32935 1726853741.00024: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpv0l_bcex" to remote "/root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/AnsiballZ_ping.py" <<< 32935 1726853741.00027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/AnsiballZ_ping.py" <<< 32935 1726853741.00514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853741.00552: stderr chunk (state=3): >>><<< 32935 1726853741.00556: stdout chunk (state=3): >>><<< 32935 1726853741.00600: done transferring module to remote 32935 1726853741.00609: _low_level_execute_command(): starting 32935 1726853741.00613: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/ /root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/AnsiballZ_ping.py && sleep 0' 32935 1726853741.01052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853741.01056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853741.01058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853741.01060: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.01065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.01120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853741.01126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853741.01128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853741.01163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853741.02894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853741.02918: stderr chunk (state=3): >>><<< 32935 1726853741.02921: stdout chunk (state=3): >>><<< 32935 1726853741.02934: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853741.02937: _low_level_execute_command(): starting 32935 1726853741.02941: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/AnsiballZ_ping.py && sleep 0' 32935 1726853741.03375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853741.03380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853741.03382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.03384: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.03386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.03432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853741.03439: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853741.03484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853741.18775: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 32935 1726853741.20091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 32935 1726853741.20120: stderr chunk (state=3): >>><<< 32935 1726853741.20124: stdout chunk (state=3): >>><<< 32935 1726853741.20138: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
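The {"ping": "pong"} payload above is the output of the ping module that the role runs as its final connectivity check. A minimal sketch of the task at roles/network/tasks/main.yml:192, reconstructed from the task name and module reported in this log (the actual role source may differ in detail):

    - name: Re-test connectivity
      ping: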
32935 1726853741.20164: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853741.20170: _low_level_execute_command(): starting 32935 1726853741.20177: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853740.9493797-34155-278777915751938/ > /dev/null 2>&1 && sleep 0' 32935 1726853741.20628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853741.20632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853741.20634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.20639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.20641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.20697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853741.20706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853741.20713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853741.20743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853741.22589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853741.22612: stderr chunk (state=3): >>><<< 32935 1726853741.22615: stdout chunk (state=3): >>><<< 32935 1726853741.22631: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853741.22639: handler run complete 32935 1726853741.22652: attempt loop complete, returning result 32935 1726853741.22655: _execute() done 32935 1726853741.22657: dumping result to json 32935 1726853741.22663: done dumping result, returning 32935 1726853741.22673: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-84df-441d-00000000007a] 32935 1726853741.22675: sending task result for task 02083763-bbaf-84df-441d-00000000007a 32935 1726853741.22763: done sending task result for task 02083763-bbaf-84df-441d-00000000007a 32935 1726853741.22766: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 32935 1726853741.22829: no more pending results, returning what we have 32935 1726853741.22833: results queue empty 32935 1726853741.22834: checking for any_errors_fatal 32935 1726853741.22841: done checking for any_errors_fatal 32935 1726853741.22842: checking for max_fail_percentage 32935 1726853741.22844: done checking for max_fail_percentage 32935 1726853741.22844: checking to see if all hosts have failed and the running result is not ok 32935 1726853741.22846: done checking to see if all hosts have failed 32935 1726853741.22846: getting the remaining hosts for this loop 32935 1726853741.22848: done getting the remaining hosts for this loop 32935 1726853741.22851: getting the next task for host managed_node1 32935 1726853741.22861: done getting next task for host managed_node1 32935 1726853741.22863: ^ task is: TASK: meta (role_complete) 32935 1726853741.22865: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853741.22879: getting variables 32935 1726853741.22881: in VariableManager get_vars() 32935 1726853741.22921: Calling all_inventory to load vars for managed_node1 32935 1726853741.22924: Calling groups_inventory to load vars for managed_node1 32935 1726853741.22927: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.22937: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.22939: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.22941: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.23948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.25648: done with get_vars() 32935 1726853741.25673: done getting variables 32935 1726853741.25760: done queuing things up, now waiting for results queue to drain 32935 1726853741.25762: results queue empty 32935 1726853741.25763: checking for any_errors_fatal 32935 1726853741.25765: done checking for any_errors_fatal 32935 1726853741.25766: checking for max_fail_percentage 32935 1726853741.25767: done checking for max_fail_percentage 32935 1726853741.25768: checking to see if all hosts have failed and the running result is not ok 32935 1726853741.25769: done checking to see if all hosts have failed 32935 1726853741.25769: getting the remaining hosts for this loop 32935 1726853741.25770: done getting the remaining hosts for this loop 32935 1726853741.25775: getting the next task for host managed_node1 32935 1726853741.25779: done getting next task for host managed_node1 32935 1726853741.25782: ^ task is: TASK: Include the task 'manage_test_interface.yml' 32935 1726853741.25783: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853741.25786: getting variables 32935 1726853741.25787: in VariableManager get_vars() 32935 1726853741.25802: Calling all_inventory to load vars for managed_node1 32935 1726853741.25805: Calling groups_inventory to load vars for managed_node1 32935 1726853741.25807: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.25812: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.25815: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.25817: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.26916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.28248: done with get_vars() 32935 1726853741.28268: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:73 Friday 20 September 2024 13:35:41 -0400 (0:00:00.373) 0:00:26.418 ****** 32935 1726853741.28324: entering _queue_task() for managed_node1/include_tasks 32935 1726853741.28701: worker is 1 (out of 1 available) 32935 1726853741.28717: exiting _queue_task() for managed_node1/include_tasks 32935 1726853741.28728: done queuing things up, now waiting for results queue to drain 32935 1726853741.28730: waiting for pending results... 
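The next task queued above comes from the test playbook itself (tests_vlan_mtu.yml:73) and simply pulls in the shared interface-management task file. A hedged sketch of what that play-level task likely looks like; note that the log later reports 'state' arriving as an include parameter, but its actual value is not visible at this point, so no value is shown here.

    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml
      # 'state' reaches the included tasks as an include parameter;
      # its value is not shown at this point in the log.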
32935 1726853741.28864: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 32935 1726853741.28937: in run() - task 02083763-bbaf-84df-441d-0000000000aa 32935 1726853741.28946: variable 'ansible_search_path' from source: unknown 32935 1726853741.28980: calling self._execute() 32935 1726853741.29055: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.29064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.29067: variable 'omit' from source: magic vars 32935 1726853741.29350: variable 'ansible_distribution_major_version' from source: facts 32935 1726853741.29363: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853741.29368: _execute() done 32935 1726853741.29376: dumping result to json 32935 1726853741.29379: done dumping result, returning 32935 1726853741.29383: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [02083763-bbaf-84df-441d-0000000000aa] 32935 1726853741.29385: sending task result for task 02083763-bbaf-84df-441d-0000000000aa 32935 1726853741.29477: done sending task result for task 02083763-bbaf-84df-441d-0000000000aa 32935 1726853741.29483: WORKER PROCESS EXITING 32935 1726853741.29521: no more pending results, returning what we have 32935 1726853741.29526: in VariableManager get_vars() 32935 1726853741.29576: Calling all_inventory to load vars for managed_node1 32935 1726853741.29579: Calling groups_inventory to load vars for managed_node1 32935 1726853741.29581: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.29595: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.29598: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.29601: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.30748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.32163: done with get_vars() 32935 1726853741.32181: variable 'ansible_search_path' from source: unknown 32935 1726853741.32193: we have included files to process 32935 1726853741.32194: generating all_blocks data 32935 1726853741.32196: done generating all_blocks data 32935 1726853741.32199: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32935 1726853741.32200: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32935 1726853741.32201: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 32935 1726853741.32456: in VariableManager get_vars() 32935 1726853741.32474: done with get_vars() 32935 1726853741.32890: done processing included file 32935 1726853741.32892: iterating over new_blocks loaded from include file 32935 1726853741.32894: in VariableManager get_vars() 32935 1726853741.32906: done with get_vars() 32935 1726853741.32907: filtering new block on tags 32935 1726853741.32926: done filtering new block on tags 32935 1726853741.32927: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 32935 1726853741.32931: extending task lists for 
all hosts with included blocks 32935 1726853741.34978: done extending task lists 32935 1726853741.34980: done processing included files 32935 1726853741.34980: results queue empty 32935 1726853741.34981: checking for any_errors_fatal 32935 1726853741.34982: done checking for any_errors_fatal 32935 1726853741.34983: checking for max_fail_percentage 32935 1726853741.34984: done checking for max_fail_percentage 32935 1726853741.34985: checking to see if all hosts have failed and the running result is not ok 32935 1726853741.34986: done checking to see if all hosts have failed 32935 1726853741.34987: getting the remaining hosts for this loop 32935 1726853741.34988: done getting the remaining hosts for this loop 32935 1726853741.34990: getting the next task for host managed_node1 32935 1726853741.34994: done getting next task for host managed_node1 32935 1726853741.34996: ^ task is: TASK: Ensure state in ["present", "absent"] 32935 1726853741.34999: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853741.35001: getting variables 32935 1726853741.35002: in VariableManager get_vars() 32935 1726853741.35017: Calling all_inventory to load vars for managed_node1 32935 1726853741.35019: Calling groups_inventory to load vars for managed_node1 32935 1726853741.35021: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.35027: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.35031: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.35033: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.36127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.37692: done with get_vars() 32935 1726853741.37711: done getting variables 32935 1726853741.37753: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 13:35:41 -0400 (0:00:00.094) 0:00:26.513 ****** 32935 1726853741.37783: entering _queue_task() for managed_node1/fail 32935 1726853741.38133: worker is 1 (out of 1 available) 32935 1726853741.38147: exiting _queue_task() for managed_node1/fail 32935 1726853741.38161: done queuing things up, now waiting for results queue to drain 32935 1726853741.38163: waiting for pending results... 
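The first two tasks inside manage_test_interface.yml (lines 3 and 8 of that file) are input guards; both are skipped below because their fail conditions evaluate to False. Reconstructed from the task names and the false_condition strings in this log, they plausibly read as follows (the message texts are assumptions):

    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be 'present' or 'absent'"        # message text assumed
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be 'dummy', 'tap' or 'veth'"      # message text assumed
      when: type not in ["dummy", "tap", "veth"]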
32935 1726853741.38489: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 32935 1726853741.38509: in run() - task 02083763-bbaf-84df-441d-00000000093c 32935 1726853741.38526: variable 'ansible_search_path' from source: unknown 32935 1726853741.38533: variable 'ansible_search_path' from source: unknown 32935 1726853741.38584: calling self._execute() 32935 1726853741.38690: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.38700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.38719: variable 'omit' from source: magic vars 32935 1726853741.39111: variable 'ansible_distribution_major_version' from source: facts 32935 1726853741.39150: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853741.39281: variable 'state' from source: include params 32935 1726853741.39293: Evaluated conditional (state not in ["present", "absent"]): False 32935 1726853741.39301: when evaluation is False, skipping this task 32935 1726853741.39368: _execute() done 32935 1726853741.39373: dumping result to json 32935 1726853741.39376: done dumping result, returning 32935 1726853741.39378: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [02083763-bbaf-84df-441d-00000000093c] 32935 1726853741.39380: sending task result for task 02083763-bbaf-84df-441d-00000000093c 32935 1726853741.39447: done sending task result for task 02083763-bbaf-84df-441d-00000000093c 32935 1726853741.39450: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 32935 1726853741.39516: no more pending results, returning what we have 32935 1726853741.39520: results queue empty 32935 1726853741.39521: checking for any_errors_fatal 32935 1726853741.39523: done checking for any_errors_fatal 32935 1726853741.39523: checking for max_fail_percentage 32935 1726853741.39525: done checking for max_fail_percentage 32935 1726853741.39526: checking to see if all hosts have failed and the running result is not ok 32935 1726853741.39527: done checking to see if all hosts have failed 32935 1726853741.39528: getting the remaining hosts for this loop 32935 1726853741.39529: done getting the remaining hosts for this loop 32935 1726853741.39532: getting the next task for host managed_node1 32935 1726853741.39540: done getting next task for host managed_node1 32935 1726853741.39542: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 32935 1726853741.39545: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853741.39548: getting variables 32935 1726853741.39550: in VariableManager get_vars() 32935 1726853741.39591: Calling all_inventory to load vars for managed_node1 32935 1726853741.39594: Calling groups_inventory to load vars for managed_node1 32935 1726853741.39596: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.39610: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.39613: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.39615: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.41116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.42964: done with get_vars() 32935 1726853741.42989: done getting variables 32935 1726853741.43079: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 13:35:41 -0400 (0:00:00.053) 0:00:26.566 ****** 32935 1726853741.43107: entering _queue_task() for managed_node1/fail 32935 1726853741.43391: worker is 1 (out of 1 available) 32935 1726853741.43404: exiting _queue_task() for managed_node1/fail 32935 1726853741.43417: done queuing things up, now waiting for results queue to drain 32935 1726853741.43418: waiting for pending results... 32935 1726853741.43592: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 32935 1726853741.43666: in run() - task 02083763-bbaf-84df-441d-00000000093d 32935 1726853741.43675: variable 'ansible_search_path' from source: unknown 32935 1726853741.43679: variable 'ansible_search_path' from source: unknown 32935 1726853741.43707: calling self._execute() 32935 1726853741.43782: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.43786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.43796: variable 'omit' from source: magic vars 32935 1726853741.44073: variable 'ansible_distribution_major_version' from source: facts 32935 1726853741.44089: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853741.44176: variable 'type' from source: play vars 32935 1726853741.44183: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 32935 1726853741.44188: when evaluation is False, skipping this task 32935 1726853741.44191: _execute() done 32935 1726853741.44193: dumping result to json 32935 1726853741.44196: done dumping result, returning 32935 1726853741.44206: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [02083763-bbaf-84df-441d-00000000093d] 32935 1726853741.44208: sending task result for task 02083763-bbaf-84df-441d-00000000093d 32935 1726853741.44288: done sending task result for task 02083763-bbaf-84df-441d-00000000093d 32935 1726853741.44291: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 32935 1726853741.44346: no more pending 
results, returning what we have 32935 1726853741.44349: results queue empty 32935 1726853741.44351: checking for any_errors_fatal 32935 1726853741.44357: done checking for any_errors_fatal 32935 1726853741.44360: checking for max_fail_percentage 32935 1726853741.44362: done checking for max_fail_percentage 32935 1726853741.44363: checking to see if all hosts have failed and the running result is not ok 32935 1726853741.44364: done checking to see if all hosts have failed 32935 1726853741.44364: getting the remaining hosts for this loop 32935 1726853741.44366: done getting the remaining hosts for this loop 32935 1726853741.44369: getting the next task for host managed_node1 32935 1726853741.44378: done getting next task for host managed_node1 32935 1726853741.44381: ^ task is: TASK: Include the task 'show_interfaces.yml' 32935 1726853741.44384: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853741.44387: getting variables 32935 1726853741.44389: in VariableManager get_vars() 32935 1726853741.44424: Calling all_inventory to load vars for managed_node1 32935 1726853741.44427: Calling groups_inventory to load vars for managed_node1 32935 1726853741.44429: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.44438: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.44440: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.44443: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.45630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.46642: done with get_vars() 32935 1726853741.46657: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 13:35:41 -0400 (0:00:00.036) 0:00:26.602 ****** 32935 1726853741.46725: entering _queue_task() for managed_node1/include_tasks 32935 1726853741.46955: worker is 1 (out of 1 available) 32935 1726853741.46973: exiting _queue_task() for managed_node1/include_tasks 32935 1726853741.46988: done queuing things up, now waiting for results queue to drain 32935 1726853741.46990: waiting for pending results... 
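After the guards, manage_test_interface.yml:13 chains into the shared interface-listing helper. A minimal sketch consistent with the task name and path logged here; the sibling-relative include path is an assumption based on both files living in the same tasks/ directory:

    - name: Include the task 'show_interfaces.yml'
      include_tasks: show_interfaces.yml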
32935 1726853741.47154: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 32935 1726853741.47222: in run() - task 02083763-bbaf-84df-441d-00000000093e 32935 1726853741.47230: variable 'ansible_search_path' from source: unknown 32935 1726853741.47233: variable 'ansible_search_path' from source: unknown 32935 1726853741.47265: calling self._execute() 32935 1726853741.47340: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.47343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.47354: variable 'omit' from source: magic vars 32935 1726853741.47632: variable 'ansible_distribution_major_version' from source: facts 32935 1726853741.47642: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853741.47653: _execute() done 32935 1726853741.47661: dumping result to json 32935 1726853741.47665: done dumping result, returning 32935 1726853741.47667: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-84df-441d-00000000093e] 32935 1726853741.47669: sending task result for task 02083763-bbaf-84df-441d-00000000093e 32935 1726853741.47744: done sending task result for task 02083763-bbaf-84df-441d-00000000093e 32935 1726853741.47746: WORKER PROCESS EXITING 32935 1726853741.47785: no more pending results, returning what we have 32935 1726853741.47789: in VariableManager get_vars() 32935 1726853741.47833: Calling all_inventory to load vars for managed_node1 32935 1726853741.47837: Calling groups_inventory to load vars for managed_node1 32935 1726853741.47840: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.47852: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.47854: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.47857: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.49187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.50245: done with get_vars() 32935 1726853741.50262: variable 'ansible_search_path' from source: unknown 32935 1726853741.50263: variable 'ansible_search_path' from source: unknown 32935 1726853741.50291: we have included files to process 32935 1726853741.50292: generating all_blocks data 32935 1726853741.50293: done generating all_blocks data 32935 1726853741.50297: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32935 1726853741.50297: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32935 1726853741.50299: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 32935 1726853741.50369: in VariableManager get_vars() 32935 1726853741.50392: done with get_vars() 32935 1726853741.50469: done processing included file 32935 1726853741.50473: iterating over new_blocks loaded from include file 32935 1726853741.50475: in VariableManager get_vars() 32935 1726853741.50490: done with get_vars() 32935 1726853741.50491: filtering new block on tags 32935 1726853741.50503: done filtering new block on tags 32935 1726853741.50504: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 32935 1726853741.50508: extending task lists for all hosts with included blocks 32935 1726853741.50733: done extending task lists 32935 1726853741.50734: done processing included files 32935 1726853741.50735: results queue empty 32935 1726853741.50735: checking for any_errors_fatal 32935 1726853741.50738: done checking for any_errors_fatal 32935 1726853741.50738: checking for max_fail_percentage 32935 1726853741.50739: done checking for max_fail_percentage 32935 1726853741.50739: checking to see if all hosts have failed and the running result is not ok 32935 1726853741.50740: done checking to see if all hosts have failed 32935 1726853741.50741: getting the remaining hosts for this loop 32935 1726853741.50741: done getting the remaining hosts for this loop 32935 1726853741.50743: getting the next task for host managed_node1 32935 1726853741.50746: done getting next task for host managed_node1 32935 1726853741.50747: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 32935 1726853741.50749: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853741.50751: getting variables 32935 1726853741.50752: in VariableManager get_vars() 32935 1726853741.50763: Calling all_inventory to load vars for managed_node1 32935 1726853741.50765: Calling groups_inventory to load vars for managed_node1 32935 1726853741.50766: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.50770: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.50773: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.50775: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.51613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.52991: done with get_vars() 32935 1726853741.53011: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:35:41 -0400 (0:00:00.063) 0:00:26.666 ****** 32935 1726853741.53102: entering _queue_task() for managed_node1/include_tasks 32935 1726853741.53463: worker is 1 (out of 1 available) 32935 1726853741.53476: exiting _queue_task() for managed_node1/include_tasks 32935 1726853741.53490: done queuing things up, now waiting for results queue to drain 32935 1726853741.53492: waiting for pending results... 
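The two include steps in this stretch of the log are plain include_tasks calls: manage_test_interface.yml pulls in show_interfaces.yml, which in turn includes get_current_interfaces.yml and then prints the gathered list. A hedged reconstruction of show_interfaces.yml from the task names, file path, and line numbers reported in this log (the real file in the collection may carry additional options):

# show_interfaces.yml (sketch; contents inferred from this log only)
- name: Include the task 'get_current_interfaces.yml'    # reported as show_interfaces.yml:3
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces                           # reported as show_interfaces.yml:5
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"

The debug message is written this way because it matches the MSG line that appears later in the log.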
32935 1726853741.53818: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 32935 1726853741.53931: in run() - task 02083763-bbaf-84df-441d-000000000aa0 32935 1726853741.53944: variable 'ansible_search_path' from source: unknown 32935 1726853741.53948: variable 'ansible_search_path' from source: unknown 32935 1726853741.53978: calling self._execute() 32935 1726853741.54061: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.54065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.54074: variable 'omit' from source: magic vars 32935 1726853741.54355: variable 'ansible_distribution_major_version' from source: facts 32935 1726853741.54367: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853741.54375: _execute() done 32935 1726853741.54378: dumping result to json 32935 1726853741.54381: done dumping result, returning 32935 1726853741.54388: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-84df-441d-000000000aa0] 32935 1726853741.54393: sending task result for task 02083763-bbaf-84df-441d-000000000aa0 32935 1726853741.54475: done sending task result for task 02083763-bbaf-84df-441d-000000000aa0 32935 1726853741.54479: WORKER PROCESS EXITING 32935 1726853741.54504: no more pending results, returning what we have 32935 1726853741.54509: in VariableManager get_vars() 32935 1726853741.54553: Calling all_inventory to load vars for managed_node1 32935 1726853741.54555: Calling groups_inventory to load vars for managed_node1 32935 1726853741.54560: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.54575: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.54578: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.54581: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.55351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.56744: done with get_vars() 32935 1726853741.56756: variable 'ansible_search_path' from source: unknown 32935 1726853741.56757: variable 'ansible_search_path' from source: unknown 32935 1726853741.56798: we have included files to process 32935 1726853741.56799: generating all_blocks data 32935 1726853741.56800: done generating all_blocks data 32935 1726853741.56801: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32935 1726853741.56801: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32935 1726853741.56803: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 32935 1726853741.56991: done processing included file 32935 1726853741.56992: iterating over new_blocks loaded from include file 32935 1726853741.56993: in VariableManager get_vars() 32935 1726853741.57006: done with get_vars() 32935 1726853741.57007: filtering new block on tags 32935 1726853741.57018: done filtering new block on tags 32935 1726853741.57019: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node1 32935 1726853741.57022: extending task lists for all hosts with included blocks 32935 1726853741.57111: done extending task lists 32935 1726853741.57112: done processing included files 32935 1726853741.57112: results queue empty 32935 1726853741.57113: checking for any_errors_fatal 32935 1726853741.57115: done checking for any_errors_fatal 32935 1726853741.57116: checking for max_fail_percentage 32935 1726853741.57117: done checking for max_fail_percentage 32935 1726853741.57117: checking to see if all hosts have failed and the running result is not ok 32935 1726853741.57117: done checking to see if all hosts have failed 32935 1726853741.57118: getting the remaining hosts for this loop 32935 1726853741.57119: done getting the remaining hosts for this loop 32935 1726853741.57120: getting the next task for host managed_node1 32935 1726853741.57123: done getting next task for host managed_node1 32935 1726853741.57125: ^ task is: TASK: Gather current interface info 32935 1726853741.57127: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853741.57128: getting variables 32935 1726853741.57129: in VariableManager get_vars() 32935 1726853741.57137: Calling all_inventory to load vars for managed_node1 32935 1726853741.57139: Calling groups_inventory to load vars for managed_node1 32935 1726853741.57140: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.57143: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.57145: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.57146: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.57774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.58617: done with get_vars() 32935 1726853741.58631: done getting variables 32935 1726853741.58659: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:35:41 -0400 (0:00:00.055) 0:00:26.722 ****** 32935 1726853741.58685: entering _queue_task() for managed_node1/command 32935 1726853741.58925: worker is 1 (out of 1 available) 32935 1726853741.58940: exiting _queue_task() for managed_node1/command 32935 1726853741.58952: done queuing things up, now waiting for results queue to drain 32935 1726853741.58954: waiting for pending results... 
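The 'Gather current interface info' task queued here is a command action; the module invocation recorded further down runs ls -1 with chdir set to /sys/class/net, and the following set_fact step reads a registered variable called _current_interfaces. A sketch consistent with those details, noting that the register name and changed_when are inferred (the raw module result reports changed: true while the final task result reports changed: false):

- name: Gather current interface info        # reported as get_current_interfaces.yml:3
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces              # inferred from the later 'Set current_interfaces' step
  changed_when: false                        # inferred from the ok (not changed) task result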
32935 1726853741.59134: running TaskExecutor() for managed_node1/TASK: Gather current interface info 32935 1726853741.59210: in run() - task 02083763-bbaf-84df-441d-000000000ad7 32935 1726853741.59221: variable 'ansible_search_path' from source: unknown 32935 1726853741.59226: variable 'ansible_search_path' from source: unknown 32935 1726853741.59253: calling self._execute() 32935 1726853741.59332: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.59336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.59345: variable 'omit' from source: magic vars 32935 1726853741.59629: variable 'ansible_distribution_major_version' from source: facts 32935 1726853741.59639: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853741.59645: variable 'omit' from source: magic vars 32935 1726853741.59681: variable 'omit' from source: magic vars 32935 1726853741.59709: variable 'omit' from source: magic vars 32935 1726853741.59744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853741.59775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853741.59791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853741.59804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853741.59813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853741.59839: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853741.59842: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.59846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.59918: Set connection var ansible_timeout to 10 32935 1726853741.59923: Set connection var ansible_shell_type to sh 32935 1726853741.59930: Set connection var ansible_pipelining to False 32935 1726853741.59934: Set connection var ansible_connection to ssh 32935 1726853741.59936: Set connection var ansible_shell_executable to /bin/sh 32935 1726853741.59947: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853741.59964: variable 'ansible_shell_executable' from source: unknown 32935 1726853741.59967: variable 'ansible_connection' from source: unknown 32935 1726853741.59970: variable 'ansible_module_compression' from source: unknown 32935 1726853741.59974: variable 'ansible_shell_type' from source: unknown 32935 1726853741.59976: variable 'ansible_shell_executable' from source: unknown 32935 1726853741.59978: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.59982: variable 'ansible_pipelining' from source: unknown 32935 1726853741.59984: variable 'ansible_timeout' from source: unknown 32935 1726853741.59989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.60096: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853741.60106: variable 'omit' from source: magic vars 32935 
1726853741.60111: starting attempt loop 32935 1726853741.60114: running the handler 32935 1726853741.60127: _low_level_execute_command(): starting 32935 1726853741.60134: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853741.60657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.60661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.60665: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.60667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.60723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853741.60726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853741.60728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853741.60784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853741.62478: stdout chunk (state=3): >>>/root <<< 32935 1726853741.62581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853741.62614: stderr chunk (state=3): >>><<< 32935 1726853741.62616: stdout chunk (state=3): >>><<< 32935 1726853741.62630: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853741.62677: _low_level_execute_command(): starting 32935 1726853741.62681: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682 `" && echo ansible-tmp-1726853741.6263576-34183-144006588443682="` echo /root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682 `" ) && sleep 0' 32935 1726853741.63073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853741.63079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.63112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853741.63115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.63126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853741.63128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853741.63130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.63180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853741.63188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853741.63226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853741.65110: stdout chunk (state=3): >>>ansible-tmp-1726853741.6263576-34183-144006588443682=/root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682 <<< 32935 1726853741.65212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853741.65236: stderr chunk (state=3): >>><<< 32935 1726853741.65239: stdout chunk (state=3): >>><<< 32935 1726853741.65254: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853741.6263576-34183-144006588443682=/root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853741.65282: variable 'ansible_module_compression' from source: unknown 32935 1726853741.65323: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853741.65348: variable 'ansible_facts' from source: unknown 32935 1726853741.65405: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/AnsiballZ_command.py 32935 1726853741.65501: Sending initial data 32935 1726853741.65504: Sent initial data (156 bytes) 32935 1726853741.65945: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.65949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853741.65951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.65953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.65955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.66010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853741.66014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853741.66017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853741.66054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853741.67594: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 32935 1726853741.67603: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853741.67632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853741.67673: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp8dzhc7hr /root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/AnsiballZ_command.py <<< 32935 1726853741.67685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/AnsiballZ_command.py" <<< 32935 1726853741.67714: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp8dzhc7hr" to remote "/root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/AnsiballZ_command.py" <<< 32935 1726853741.67716: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/AnsiballZ_command.py" <<< 32935 1726853741.68238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853741.68284: stderr chunk (state=3): >>><<< 32935 1726853741.68287: stdout chunk (state=3): >>><<< 32935 1726853741.68325: done transferring module to remote 32935 1726853741.68339: _low_level_execute_command(): starting 32935 1726853741.68343: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/ /root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/AnsiballZ_command.py && sleep 0' 32935 1726853741.68767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853741.68804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.68807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.68809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.68815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.68863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853741.68866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853741.68869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853741.68911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853741.70645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853741.70670: stderr chunk (state=3): >>><<< 32935 1726853741.70675: stdout chunk (state=3): >>><<< 32935 1726853741.70689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853741.70692: _low_level_execute_command(): starting 32935 1726853741.70696: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/AnsiballZ_command.py && sleep 0' 32935 1726853741.71125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.71128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853741.71131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853741.71133: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853741.71135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.71189: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853741.71193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853741.71199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853741.71243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853741.86585: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:41.861337", "end": "2024-09-20 13:35:41.864734", "delta": "0:00:00.003397", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853741.88095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
Shared connection to 10.31.45.153 closed. <<< 32935 1726853741.88127: stderr chunk (state=3): >>><<< 32935 1726853741.88130: stdout chunk (state=3): >>><<< 32935 1726853741.88144: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nlsr101\npeerlsr101", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:35:41.861337", "end": "2024-09-20 13:35:41.864734", "delta": "0:00:00.003397", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
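With pipelining disabled, the exchange above is the standard non-pipelined module flow: create a remote temp directory, sftp the AnsiballZ_command.py payload into it, chmod it, run it with the remote python3.12, and (as the next lines show) remove the temp directory again. The connection variables driving this (ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False, ansible_timeout=10, ansible_module_compression=ZIP_DEFLATED) were set at the start of the task; a hypothetical YAML inventory entry that would yield the same values is sketched below, although in practice most of them are ansible-core defaults and only ansible_host would need to be set explicitly:

# Hypothetical inventory sketch; values mirror the "Set connection var" lines above
all:
  hosts:
    managed_node1:
      ansible_host: 10.31.45.153
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_pipelining: false
      ansible_timeout: 10
      ansible_module_compression: ZIP_DEFLATED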
32935 1726853741.88177: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853741.88183: _low_level_execute_command(): starting 32935 1726853741.88188: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853741.6263576-34183-144006588443682/ > /dev/null 2>&1 && sleep 0' 32935 1726853741.88639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853741.88642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853741.88649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.88651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853741.88653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853741.88707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853741.88710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853741.88714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853741.88752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853741.90548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853741.90574: stderr chunk (state=3): >>><<< 32935 1726853741.90577: stdout chunk (state=3): >>><<< 32935 1726853741.90590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853741.90598: handler run complete 32935 1726853741.90617: Evaluated conditional (False): False 32935 1726853741.90626: attempt loop complete, returning result 32935 1726853741.90629: _execute() done 32935 1726853741.90631: dumping result to json 32935 1726853741.90637: done dumping result, returning 32935 1726853741.90644: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [02083763-bbaf-84df-441d-000000000ad7] 32935 1726853741.90647: sending task result for task 02083763-bbaf-84df-441d-000000000ad7 32935 1726853741.90745: done sending task result for task 02083763-bbaf-84df-441d-000000000ad7 32935 1726853741.90748: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003397", "end": "2024-09-20 13:35:41.864734", "rc": 0, "start": "2024-09-20 13:35:41.861337" } STDOUT: bonding_masters eth0 lo lsr101 peerlsr101 32935 1726853741.90817: no more pending results, returning what we have 32935 1726853741.90820: results queue empty 32935 1726853741.90821: checking for any_errors_fatal 32935 1726853741.90822: done checking for any_errors_fatal 32935 1726853741.90823: checking for max_fail_percentage 32935 1726853741.90825: done checking for max_fail_percentage 32935 1726853741.90826: checking to see if all hosts have failed and the running result is not ok 32935 1726853741.90827: done checking to see if all hosts have failed 32935 1726853741.90828: getting the remaining hosts for this loop 32935 1726853741.90829: done getting the remaining hosts for this loop 32935 1726853741.90833: getting the next task for host managed_node1 32935 1726853741.90841: done getting next task for host managed_node1 32935 1726853741.90843: ^ task is: TASK: Set current_interfaces 32935 1726853741.90848: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853741.90853: getting variables 32935 1726853741.90854: in VariableManager get_vars() 32935 1726853741.90897: Calling all_inventory to load vars for managed_node1 32935 1726853741.90900: Calling groups_inventory to load vars for managed_node1 32935 1726853741.90902: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.90913: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.90916: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.90918: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.91791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.95854: done with get_vars() 32935 1726853741.95876: done getting variables 32935 1726853741.95912: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:35:41 -0400 (0:00:00.372) 0:00:27.094 ****** 32935 1726853741.95931: entering _queue_task() for managed_node1/set_fact 32935 1726853741.96196: worker is 1 (out of 1 available) 32935 1726853741.96210: exiting _queue_task() for managed_node1/set_fact 32935 1726853741.96224: done queuing things up, now waiting for results queue to drain 32935 1726853741.96227: waiting for pending results... 
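The 'Set current_interfaces' task queued here is a set_fact that turns the registered command output into a list; the fact value printed in the result further down matches the stdout lines of the ls run exactly. A sketch consistent with that, where the precise Jinja expression is an assumption (stdout_lines is simply the most direct way to obtain the list seen in the result):

- name: Set current_interfaces               # reported as get_current_interfaces.yml:9
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # expression assumed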
32935 1726853741.96410: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 32935 1726853741.96500: in run() - task 02083763-bbaf-84df-441d-000000000ad8 32935 1726853741.96511: variable 'ansible_search_path' from source: unknown 32935 1726853741.96514: variable 'ansible_search_path' from source: unknown 32935 1726853741.96542: calling self._execute() 32935 1726853741.96626: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.96631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.96638: variable 'omit' from source: magic vars 32935 1726853741.96929: variable 'ansible_distribution_major_version' from source: facts 32935 1726853741.96939: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853741.96945: variable 'omit' from source: magic vars 32935 1726853741.96986: variable 'omit' from source: magic vars 32935 1726853741.97061: variable '_current_interfaces' from source: set_fact 32935 1726853741.97117: variable 'omit' from source: magic vars 32935 1726853741.97150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853741.97183: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853741.97199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853741.97216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853741.97225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853741.97248: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853741.97251: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.97254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.97323: Set connection var ansible_timeout to 10 32935 1726853741.97329: Set connection var ansible_shell_type to sh 32935 1726853741.97337: Set connection var ansible_pipelining to False 32935 1726853741.97340: Set connection var ansible_connection to ssh 32935 1726853741.97345: Set connection var ansible_shell_executable to /bin/sh 32935 1726853741.97351: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853741.97373: variable 'ansible_shell_executable' from source: unknown 32935 1726853741.97376: variable 'ansible_connection' from source: unknown 32935 1726853741.97379: variable 'ansible_module_compression' from source: unknown 32935 1726853741.97381: variable 'ansible_shell_type' from source: unknown 32935 1726853741.97384: variable 'ansible_shell_executable' from source: unknown 32935 1726853741.97386: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853741.97388: variable 'ansible_pipelining' from source: unknown 32935 1726853741.97390: variable 'ansible_timeout' from source: unknown 32935 1726853741.97396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853741.97499: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 32935 1726853741.97509: variable 'omit' from source: magic vars 32935 1726853741.97514: starting attempt loop 32935 1726853741.97517: running the handler 32935 1726853741.97526: handler run complete 32935 1726853741.97535: attempt loop complete, returning result 32935 1726853741.97537: _execute() done 32935 1726853741.97540: dumping result to json 32935 1726853741.97542: done dumping result, returning 32935 1726853741.97555: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [02083763-bbaf-84df-441d-000000000ad8] 32935 1726853741.97557: sending task result for task 02083763-bbaf-84df-441d-000000000ad8 32935 1726853741.97631: done sending task result for task 02083763-bbaf-84df-441d-000000000ad8 32935 1726853741.97634: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "lsr101", "peerlsr101" ] }, "changed": false } 32935 1726853741.97710: no more pending results, returning what we have 32935 1726853741.97714: results queue empty 32935 1726853741.97715: checking for any_errors_fatal 32935 1726853741.97725: done checking for any_errors_fatal 32935 1726853741.97725: checking for max_fail_percentage 32935 1726853741.97727: done checking for max_fail_percentage 32935 1726853741.97728: checking to see if all hosts have failed and the running result is not ok 32935 1726853741.97729: done checking to see if all hosts have failed 32935 1726853741.97730: getting the remaining hosts for this loop 32935 1726853741.97732: done getting the remaining hosts for this loop 32935 1726853741.97735: getting the next task for host managed_node1 32935 1726853741.97744: done getting next task for host managed_node1 32935 1726853741.97747: ^ task is: TASK: Show current_interfaces 32935 1726853741.97750: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853741.97754: getting variables 32935 1726853741.97756: in VariableManager get_vars() 32935 1726853741.97802: Calling all_inventory to load vars for managed_node1 32935 1726853741.97805: Calling groups_inventory to load vars for managed_node1 32935 1726853741.97807: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853741.97816: Calling all_plugins_play to load vars for managed_node1 32935 1726853741.97819: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853741.97821: Calling groups_plugins_play to load vars for managed_node1 32935 1726853741.98578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853741.99440: done with get_vars() 32935 1726853741.99455: done getting variables 32935 1726853741.99498: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:35:41 -0400 (0:00:00.035) 0:00:27.130 ****** 32935 1726853741.99522: entering _queue_task() for managed_node1/debug 32935 1726853741.99752: worker is 1 (out of 1 available) 32935 1726853741.99765: exiting _queue_task() for managed_node1/debug 32935 1726853741.99781: done queuing things up, now waiting for results queue to drain 32935 1726853741.99783: waiting for pending results... 32935 1726853741.99955: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 32935 1726853742.00040: in run() - task 02083763-bbaf-84df-441d-000000000aa1 32935 1726853742.00050: variable 'ansible_search_path' from source: unknown 32935 1726853742.00053: variable 'ansible_search_path' from source: unknown 32935 1726853742.00089: calling self._execute() 32935 1726853742.00172: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.00176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.00185: variable 'omit' from source: magic vars 32935 1726853742.00466: variable 'ansible_distribution_major_version' from source: facts 32935 1726853742.00477: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853742.00483: variable 'omit' from source: magic vars 32935 1726853742.00513: variable 'omit' from source: magic vars 32935 1726853742.00583: variable 'current_interfaces' from source: set_fact 32935 1726853742.00604: variable 'omit' from source: magic vars 32935 1726853742.00634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853742.00661: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853742.00688: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853742.00701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853742.00710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853742.00734: 
variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853742.00737: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.00740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.00811: Set connection var ansible_timeout to 10 32935 1726853742.00815: Set connection var ansible_shell_type to sh 32935 1726853742.00822: Set connection var ansible_pipelining to False 32935 1726853742.00825: Set connection var ansible_connection to ssh 32935 1726853742.00829: Set connection var ansible_shell_executable to /bin/sh 32935 1726853742.00835: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853742.00852: variable 'ansible_shell_executable' from source: unknown 32935 1726853742.00856: variable 'ansible_connection' from source: unknown 32935 1726853742.00860: variable 'ansible_module_compression' from source: unknown 32935 1726853742.00863: variable 'ansible_shell_type' from source: unknown 32935 1726853742.00866: variable 'ansible_shell_executable' from source: unknown 32935 1726853742.00868: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.00870: variable 'ansible_pipelining' from source: unknown 32935 1726853742.00874: variable 'ansible_timeout' from source: unknown 32935 1726853742.00876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.00974: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853742.00983: variable 'omit' from source: magic vars 32935 1726853742.00990: starting attempt loop 32935 1726853742.00993: running the handler 32935 1726853742.01031: handler run complete 32935 1726853742.01042: attempt loop complete, returning result 32935 1726853742.01045: _execute() done 32935 1726853742.01048: dumping result to json 32935 1726853742.01051: done dumping result, returning 32935 1726853742.01057: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [02083763-bbaf-84df-441d-000000000aa1] 32935 1726853742.01062: sending task result for task 02083763-bbaf-84df-441d-000000000aa1 32935 1726853742.01138: done sending task result for task 02083763-bbaf-84df-441d-000000000aa1 32935 1726853742.01140: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'lsr101', 'peerlsr101'] 32935 1726853742.01187: no more pending results, returning what we have 32935 1726853742.01190: results queue empty 32935 1726853742.01191: checking for any_errors_fatal 32935 1726853742.01198: done checking for any_errors_fatal 32935 1726853742.01198: checking for max_fail_percentage 32935 1726853742.01200: done checking for max_fail_percentage 32935 1726853742.01201: checking to see if all hosts have failed and the running result is not ok 32935 1726853742.01202: done checking to see if all hosts have failed 32935 1726853742.01203: getting the remaining hosts for this loop 32935 1726853742.01204: done getting the remaining hosts for this loop 32935 1726853742.01208: getting the next task for host managed_node1 32935 1726853742.01217: done getting next task for host managed_node1 32935 1726853742.01219: ^ task is: TASK: Install iproute 32935 1726853742.01221: ^ state is: HOST STATE: block=2, 
task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853742.01226: getting variables 32935 1726853742.01228: in VariableManager get_vars() 32935 1726853742.01268: Calling all_inventory to load vars for managed_node1 32935 1726853742.01272: Calling groups_inventory to load vars for managed_node1 32935 1726853742.01275: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853742.01285: Calling all_plugins_play to load vars for managed_node1 32935 1726853742.01287: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853742.01289: Calling groups_plugins_play to load vars for managed_node1 32935 1726853742.02375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853742.03229: done with get_vars() 32935 1726853742.03243: done getting variables 32935 1726853742.03288: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 13:35:42 -0400 (0:00:00.037) 0:00:27.168 ****** 32935 1726853742.03311: entering _queue_task() for managed_node1/package 32935 1726853742.03599: worker is 1 (out of 1 available) 32935 1726853742.03613: exiting _queue_task() for managed_node1/package 32935 1726853742.03626: done queuing things up, now waiting for results queue to drain 32935 1726853742.03628: waiting for pending results... 
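The "Install iproute" task queued above lives at manage_test_interface.yml:16 and resolves to the generic package action. Reconstructed from details that appear later in this trace (package name iproute, state present, a registered __install_status fact, and an "attempts" field in the result), the task is probably shaped like the minimal sketch below; the retries and delay values are assumptions and do not appear anywhere in the log.

- hosts: managed_node1
  tasks:
    - name: Install iproute
      ansible.builtin.package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success
      retries: 3   # assumption: the real retry count is not visible in this trace
      delay: 10    # assumption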
32935 1726853742.03998: running TaskExecutor() for managed_node1/TASK: Install iproute 32935 1726853742.04014: in run() - task 02083763-bbaf-84df-441d-00000000093f 32935 1726853742.04032: variable 'ansible_search_path' from source: unknown 32935 1726853742.04038: variable 'ansible_search_path' from source: unknown 32935 1726853742.04093: calling self._execute() 32935 1726853742.04186: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.04200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.04276: variable 'omit' from source: magic vars 32935 1726853742.04605: variable 'ansible_distribution_major_version' from source: facts 32935 1726853742.04622: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853742.04638: variable 'omit' from source: magic vars 32935 1726853742.04683: variable 'omit' from source: magic vars 32935 1726853742.04877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 32935 1726853742.06980: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 32935 1726853742.07032: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 32935 1726853742.07068: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 32935 1726853742.07092: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 32935 1726853742.07113: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 32935 1726853742.07188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 32935 1726853742.07207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 32935 1726853742.07224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 32935 1726853742.07249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 32935 1726853742.07264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 32935 1726853742.07337: variable '__network_is_ostree' from source: set_fact 32935 1726853742.07341: variable 'omit' from source: magic vars 32935 1726853742.07367: variable 'omit' from source: magic vars 32935 1726853742.07389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853742.07415: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853742.07429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853742.07443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 32935 1726853742.07451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853742.07478: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853742.07481: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.07483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.07550: Set connection var ansible_timeout to 10 32935 1726853742.07555: Set connection var ansible_shell_type to sh 32935 1726853742.07575: Set connection var ansible_pipelining to False 32935 1726853742.07580: Set connection var ansible_connection to ssh 32935 1726853742.07582: Set connection var ansible_shell_executable to /bin/sh 32935 1726853742.07585: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853742.07594: variable 'ansible_shell_executable' from source: unknown 32935 1726853742.07598: variable 'ansible_connection' from source: unknown 32935 1726853742.07601: variable 'ansible_module_compression' from source: unknown 32935 1726853742.07603: variable 'ansible_shell_type' from source: unknown 32935 1726853742.07605: variable 'ansible_shell_executable' from source: unknown 32935 1726853742.07609: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.07611: variable 'ansible_pipelining' from source: unknown 32935 1726853742.07613: variable 'ansible_timeout' from source: unknown 32935 1726853742.07615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.07684: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853742.07694: variable 'omit' from source: magic vars 32935 1726853742.07697: starting attempt loop 32935 1726853742.07700: running the handler 32935 1726853742.07708: variable 'ansible_facts' from source: unknown 32935 1726853742.07710: variable 'ansible_facts' from source: unknown 32935 1726853742.07737: _low_level_execute_command(): starting 32935 1726853742.07744: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853742.08232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.08237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.08240: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.08242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.08293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853742.08296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.08351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.10049: stdout chunk (state=3): >>>/root <<< 32935 1726853742.10151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853742.10200: stderr chunk (state=3): >>><<< 32935 1726853742.10203: stdout chunk (state=3): >>><<< 32935 1726853742.10220: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853742.10319: _low_level_execute_command(): starting 32935 1726853742.10323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754 `" && echo ansible-tmp-1726853742.1023326-34193-223802705881754="` echo /root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754 `" ) && sleep 0' 32935 1726853742.10839: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853742.10852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853742.10869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.10888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853742.10904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853742.10992: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.11019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853742.11034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.11097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.13076: stdout chunk (state=3): >>>ansible-tmp-1726853742.1023326-34193-223802705881754=/root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754 <<< 32935 1726853742.13168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853742.13181: stdout chunk (state=3): >>><<< 32935 1726853742.13192: stderr chunk (state=3): >>><<< 32935 1726853742.13218: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853742.1023326-34193-223802705881754=/root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853742.13260: variable 'ansible_module_compression' from source: unknown 32935 1726853742.13330: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 32935 1726853742.13575: variable 'ansible_facts' from source: unknown 32935 1726853742.13578: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/AnsiballZ_dnf.py 32935 1726853742.13729: Sending initial data 32935 1726853742.13732: Sent initial data (152 bytes) 32935 1726853742.14248: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853742.14258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853742.14273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.14292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853742.14385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853742.14400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853742.14412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.14482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.16169: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 32935 1726853742.16180: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 32935 1726853742.16184: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 32935 1726853742.16186: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 32935 1726853742.16189: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853742.16231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 32935 1726853742.16267: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp3hnynt2w /root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/AnsiballZ_dnf.py <<< 32935 1726853742.16307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/AnsiballZ_dnf.py" <<< 32935 1726853742.16323: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmp3hnynt2w" to remote "/root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/AnsiballZ_dnf.py" <<< 32935 1726853742.17351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853742.17484: stderr chunk (state=3): >>><<< 32935 1726853742.17487: stdout chunk (state=3): >>><<< 32935 1726853742.17501: done transferring module to remote 32935 1726853742.17516: _low_level_execute_command(): starting 32935 1726853742.17526: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/ /root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/AnsiballZ_dnf.py && sleep 0' 32935 1726853742.18224: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853742.18246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853742.18341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.18363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853742.18384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.18449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.20270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853742.20297: stdout chunk (state=3): >>><<< 32935 1726853742.20299: stderr chunk (state=3): >>><<< 32935 1726853742.20318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853742.20393: _low_level_execute_command(): starting 32935 1726853742.20396: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/AnsiballZ_dnf.py && sleep 0' 32935 1726853742.21088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.21180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853742.21207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.21289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.62395: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 32935 1726853742.66447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853742.66475: stderr chunk (state=3): >>><<< 32935 1726853742.66478: stdout chunk (state=3): >>><<< 32935 1726853742.66494: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
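At this point the AnsiballZ_dnf.py payload has been copied over SFTP, made executable, run with /usr/bin/python3.12, and has returned rc=0 with "Nothing to do" because iproute is already installed. Judging from the module_args echoed in the result and the ansible.legacy.dnf module name in the next entry, the generic package action resolved to the dnf backend on this host; an explicit equivalent of that resolved call, written as a standalone sketch, would look roughly like:

- hosts: managed_node1
  tasks:
    - name: Install iproute (explicit dnf equivalent of the package call above)
      ansible.builtin.dnf:
        name: iproute
        state: present

Because the package is already present, dnf reports changed: false, which is why the task result that follows is "ok" with the message "Nothing to do".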
32935 1726853742.66529: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853742.66537: _low_level_execute_command(): starting 32935 1726853742.66540: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853742.1023326-34193-223802705881754/ > /dev/null 2>&1 && sleep 0' 32935 1726853742.66974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.66978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.66980: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853742.66982: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.66984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.67036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853742.67043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.67082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.68931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853742.68954: stderr chunk (state=3): >>><<< 32935 1726853742.68957: stdout chunk (state=3): >>><<< 32935 1726853742.68974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853742.68980: handler run complete 32935 1726853742.69100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 32935 1726853742.69243: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 32935 1726853742.69276: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 32935 1726853742.69300: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 32935 1726853742.69321: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 32935 1726853742.69378: variable '__install_status' from source: set_fact 32935 1726853742.69393: Evaluated conditional (__install_status is success): True 32935 1726853742.69405: attempt loop complete, returning result 32935 1726853742.69408: _execute() done 32935 1726853742.69410: dumping result to json 32935 1726853742.69415: done dumping result, returning 32935 1726853742.69422: done running TaskExecutor() for managed_node1/TASK: Install iproute [02083763-bbaf-84df-441d-00000000093f] 32935 1726853742.69424: sending task result for task 02083763-bbaf-84df-441d-00000000093f ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 32935 1726853742.69600: no more pending results, returning what we have 32935 1726853742.69604: results queue empty 32935 1726853742.69605: checking for any_errors_fatal 32935 1726853742.69611: done checking for any_errors_fatal 32935 1726853742.69612: checking for max_fail_percentage 32935 1726853742.69614: done checking for max_fail_percentage 32935 1726853742.69615: checking to see if all hosts have failed and the running result is not ok 32935 1726853742.69616: done checking to see if all hosts have failed 32935 1726853742.69616: getting the remaining hosts for this loop 32935 1726853742.69618: done getting the remaining hosts for this loop 32935 1726853742.69622: getting the next task for host managed_node1 32935 1726853742.69630: done getting next task for host managed_node1 32935 1726853742.69633: ^ task is: TASK: Create veth interface {{ interface }} 32935 1726853742.69635: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853742.69639: getting variables 32935 1726853742.69641: in VariableManager get_vars() 32935 1726853742.69681: Calling all_inventory to load vars for managed_node1 32935 1726853742.69683: Calling groups_inventory to load vars for managed_node1 32935 1726853742.69685: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853742.69696: Calling all_plugins_play to load vars for managed_node1 32935 1726853742.69698: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853742.69701: Calling groups_plugins_play to load vars for managed_node1 32935 1726853742.70504: done sending task result for task 02083763-bbaf-84df-441d-00000000093f 32935 1726853742.70508: WORKER PROCESS EXITING 32935 1726853742.70518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853742.71387: done with get_vars() 32935 1726853742.71405: done getting variables 32935 1726853742.71447: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853742.71537: variable 'interface' from source: play vars TASK [Create veth interface lsr101] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 13:35:42 -0400 (0:00:00.682) 0:00:27.851 ****** 32935 1726853742.71559: entering _queue_task() for managed_node1/command 32935 1726853742.71800: worker is 1 (out of 1 available) 32935 1726853742.71816: exiting _queue_task() for managed_node1/command 32935 1726853742.71829: done queuing things up, now waiting for results queue to drain 32935 1726853742.71831: waiting for pending results... 
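The next task queued is "Create veth interface lsr101" (manage_test_interface.yml:27), a command task driven by the items lookup. From the three skipped loop items and the false_condition reported below, it is most likely shaped like the following sketch; the vars block is illustrative only and mirrors the values visible in this trace (interface and current_interfaces from the play vars and set_fact, type and state assumed to be veth/absent to match the evaluated conditionals).

- hosts: managed_node1
  vars:
    interface: lsr101
    type: veth        # assumed; matches the conditionals evaluated in this trace
    state: absent     # assumed; with 'absent' the create task skips, as seen below
    current_interfaces: ['bonding_masters', 'eth0', 'lo', 'lsr101', 'peerlsr101']
  tasks:
    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces

With these values the when clause is false for every item, so each ip link command is reported as skipping with "Conditional result was False", exactly as in the output that follows.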
32935 1726853742.72008: running TaskExecutor() for managed_node1/TASK: Create veth interface lsr101 32935 1726853742.72082: in run() - task 02083763-bbaf-84df-441d-000000000940 32935 1726853742.72092: variable 'ansible_search_path' from source: unknown 32935 1726853742.72096: variable 'ansible_search_path' from source: unknown 32935 1726853742.72309: variable 'interface' from source: play vars 32935 1726853742.72367: variable 'interface' from source: play vars 32935 1726853742.72420: variable 'interface' from source: play vars 32935 1726853742.72536: Loaded config def from plugin (lookup/items) 32935 1726853742.72544: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 32935 1726853742.72566: variable 'omit' from source: magic vars 32935 1726853742.72660: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.72669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.72680: variable 'omit' from source: magic vars 32935 1726853742.72844: variable 'ansible_distribution_major_version' from source: facts 32935 1726853742.72849: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853742.72989: variable 'type' from source: play vars 32935 1726853742.72992: variable 'state' from source: include params 32935 1726853742.72995: variable 'interface' from source: play vars 32935 1726853742.72999: variable 'current_interfaces' from source: set_fact 32935 1726853742.73008: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 32935 1726853742.73011: when evaluation is False, skipping this task 32935 1726853742.73030: variable 'item' from source: unknown 32935 1726853742.73086: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link add lsr101 type veth peer name peerlsr101) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add lsr101 type veth peer name peerlsr101", "skip_reason": "Conditional result was False" } 32935 1726853742.73228: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.73231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.73234: variable 'omit' from source: magic vars 32935 1726853742.73302: variable 'ansible_distribution_major_version' from source: facts 32935 1726853742.73305: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853742.73424: variable 'type' from source: play vars 32935 1726853742.73428: variable 'state' from source: include params 32935 1726853742.73431: variable 'interface' from source: play vars 32935 1726853742.73433: variable 'current_interfaces' from source: set_fact 32935 1726853742.73440: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 32935 1726853742.73443: when evaluation is False, skipping this task 32935 1726853742.73466: variable 'item' from source: unknown 32935 1726853742.73509: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link set peerlsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerlsr101 up", "skip_reason": "Conditional result was False" } 32935 1726853742.73578: variable 
'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.73582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.73592: variable 'omit' from source: magic vars 32935 1726853742.73682: variable 'ansible_distribution_major_version' from source: facts 32935 1726853742.73685: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853742.73798: variable 'type' from source: play vars 32935 1726853742.73803: variable 'state' from source: include params 32935 1726853742.73806: variable 'interface' from source: play vars 32935 1726853742.73808: variable 'current_interfaces' from source: set_fact 32935 1726853742.73818: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 32935 1726853742.73821: when evaluation is False, skipping this task 32935 1726853742.73836: variable 'item' from source: unknown 32935 1726853742.73882: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link set lsr101 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set lsr101 up", "skip_reason": "Conditional result was False" } 32935 1726853742.73948: dumping result to json 32935 1726853742.73950: done dumping result, returning 32935 1726853742.73952: done running TaskExecutor() for managed_node1/TASK: Create veth interface lsr101 [02083763-bbaf-84df-441d-000000000940] 32935 1726853742.73954: sending task result for task 02083763-bbaf-84df-441d-000000000940 32935 1726853742.73990: done sending task result for task 02083763-bbaf-84df-441d-000000000940 32935 1726853742.73992: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false } MSG: All items skipped 32935 1726853742.74024: no more pending results, returning what we have 32935 1726853742.74026: results queue empty 32935 1726853742.74027: checking for any_errors_fatal 32935 1726853742.74034: done checking for any_errors_fatal 32935 1726853742.74035: checking for max_fail_percentage 32935 1726853742.74036: done checking for max_fail_percentage 32935 1726853742.74037: checking to see if all hosts have failed and the running result is not ok 32935 1726853742.74038: done checking to see if all hosts have failed 32935 1726853742.74038: getting the remaining hosts for this loop 32935 1726853742.74040: done getting the remaining hosts for this loop 32935 1726853742.74043: getting the next task for host managed_node1 32935 1726853742.74051: done getting next task for host managed_node1 32935 1726853742.74053: ^ task is: TASK: Set up veth as managed by NetworkManager 32935 1726853742.74055: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853742.74059: getting variables 32935 1726853742.74061: in VariableManager get_vars() 32935 1726853742.74100: Calling all_inventory to load vars for managed_node1 32935 1726853742.74102: Calling groups_inventory to load vars for managed_node1 32935 1726853742.74105: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853742.74114: Calling all_plugins_play to load vars for managed_node1 32935 1726853742.74117: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853742.74119: Calling groups_plugins_play to load vars for managed_node1 32935 1726853742.74993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853742.75851: done with get_vars() 32935 1726853742.75867: done getting variables 32935 1726853742.75911: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 13:35:42 -0400 (0:00:00.043) 0:00:27.895 ****** 32935 1726853742.75936: entering _queue_task() for managed_node1/command 32935 1726853742.76158: worker is 1 (out of 1 available) 32935 1726853742.76174: exiting _queue_task() for managed_node1/command 32935 1726853742.76186: done queuing things up, now waiting for results queue to drain 32935 1726853742.76188: waiting for pending results... 
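The "Set up veth as managed by NetworkManager" task that runs next (manage_test_interface.yml:35) is another command task; only its name, its action plugin, and the conditional type == 'veth' and state == 'present' are visible in the trace, because the task is skipped. A plausible fragment, slotting into the same play as the previous sketch, is shown below; the nmcli command body is a placeholder assumption, not taken from the log.

    - name: Set up veth as managed by NetworkManager
      # placeholder: the real command string is not echoed in this trace
      ansible.builtin.command: nmcli d set {{ interface }} managed true
      when: type == 'veth' and state == 'present'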
32935 1726853742.76363: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 32935 1726853742.76437: in run() - task 02083763-bbaf-84df-441d-000000000941 32935 1726853742.76448: variable 'ansible_search_path' from source: unknown 32935 1726853742.76451: variable 'ansible_search_path' from source: unknown 32935 1726853742.76485: calling self._execute() 32935 1726853742.76558: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.76565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.76574: variable 'omit' from source: magic vars 32935 1726853742.76847: variable 'ansible_distribution_major_version' from source: facts 32935 1726853742.76859: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853742.76956: variable 'type' from source: play vars 32935 1726853742.76962: variable 'state' from source: include params 32935 1726853742.76974: Evaluated conditional (type == 'veth' and state == 'present'): False 32935 1726853742.76978: when evaluation is False, skipping this task 32935 1726853742.76981: _execute() done 32935 1726853742.76983: dumping result to json 32935 1726853742.76986: done dumping result, returning 32935 1726853742.76988: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [02083763-bbaf-84df-441d-000000000941] 32935 1726853742.76993: sending task result for task 02083763-bbaf-84df-441d-000000000941 32935 1726853742.77070: done sending task result for task 02083763-bbaf-84df-441d-000000000941 32935 1726853742.77075: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 32935 1726853742.77124: no more pending results, returning what we have 32935 1726853742.77128: results queue empty 32935 1726853742.77129: checking for any_errors_fatal 32935 1726853742.77142: done checking for any_errors_fatal 32935 1726853742.77142: checking for max_fail_percentage 32935 1726853742.77144: done checking for max_fail_percentage 32935 1726853742.77145: checking to see if all hosts have failed and the running result is not ok 32935 1726853742.77146: done checking to see if all hosts have failed 32935 1726853742.77146: getting the remaining hosts for this loop 32935 1726853742.77148: done getting the remaining hosts for this loop 32935 1726853742.77151: getting the next task for host managed_node1 32935 1726853742.77158: done getting next task for host managed_node1 32935 1726853742.77160: ^ task is: TASK: Delete veth interface {{ interface }} 32935 1726853742.77163: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853742.77167: getting variables 32935 1726853742.77168: in VariableManager get_vars() 32935 1726853742.77204: Calling all_inventory to load vars for managed_node1 32935 1726853742.77207: Calling groups_inventory to load vars for managed_node1 32935 1726853742.77209: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853742.77217: Calling all_plugins_play to load vars for managed_node1 32935 1726853742.77220: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853742.77222: Calling groups_plugins_play to load vars for managed_node1 32935 1726853742.77962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853742.78918: done with get_vars() 32935 1726853742.78932: done getting variables 32935 1726853742.78974: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853742.79053: variable 'interface' from source: play vars TASK [Delete veth interface lsr101] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 13:35:42 -0400 (0:00:00.031) 0:00:27.926 ****** 32935 1726853742.79076: entering _queue_task() for managed_node1/command 32935 1726853742.79298: worker is 1 (out of 1 available) 32935 1726853742.79314: exiting _queue_task() for managed_node1/command 32935 1726853742.79327: done queuing things up, now waiting for results queue to drain 32935 1726853742.79328: waiting for pending results... 
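The "Delete veth interface lsr101" task (manage_test_interface.yml:43) is the one that actually executes in this run: its conditional type == 'veth' and state == 'absent' and interface in current_interfaces evaluates to True below, after which the trace opens the multiplexed SSH session and creates a remote temp directory for the command module. The task name template and the when clause come from the trace; the exact ip link del arguments are an assumption, since the command string itself is not echoed in this part of the log.

    - name: Delete veth interface {{ interface }}
      # assumed command shape; only the task name and the when clause appear in the trace
      ansible.builtin.command: ip link del {{ interface }} type veth peer name peer{{ interface }}
      when: type == 'veth' and state == 'absent' and interface in current_interfaces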
32935 1726853742.79505: running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr101 32935 1726853742.79577: in run() - task 02083763-bbaf-84df-441d-000000000942 32935 1726853742.79587: variable 'ansible_search_path' from source: unknown 32935 1726853742.79590: variable 'ansible_search_path' from source: unknown 32935 1726853742.79618: calling self._execute() 32935 1726853742.79702: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.79706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.79717: variable 'omit' from source: magic vars 32935 1726853742.79997: variable 'ansible_distribution_major_version' from source: facts 32935 1726853742.80004: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853742.80131: variable 'type' from source: play vars 32935 1726853742.80135: variable 'state' from source: include params 32935 1726853742.80139: variable 'interface' from source: play vars 32935 1726853742.80141: variable 'current_interfaces' from source: set_fact 32935 1726853742.80150: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 32935 1726853742.80155: variable 'omit' from source: magic vars 32935 1726853742.80180: variable 'omit' from source: magic vars 32935 1726853742.80247: variable 'interface' from source: play vars 32935 1726853742.80263: variable 'omit' from source: magic vars 32935 1726853742.80303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853742.80333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853742.80349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853742.80422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853742.80426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853742.80429: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853742.80431: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.80433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.80473: Set connection var ansible_timeout to 10 32935 1726853742.80477: Set connection var ansible_shell_type to sh 32935 1726853742.80484: Set connection var ansible_pipelining to False 32935 1726853742.80487: Set connection var ansible_connection to ssh 32935 1726853742.80492: Set connection var ansible_shell_executable to /bin/sh 32935 1726853742.80497: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853742.80515: variable 'ansible_shell_executable' from source: unknown 32935 1726853742.80518: variable 'ansible_connection' from source: unknown 32935 1726853742.80520: variable 'ansible_module_compression' from source: unknown 32935 1726853742.80524: variable 'ansible_shell_type' from source: unknown 32935 1726853742.80527: variable 'ansible_shell_executable' from source: unknown 32935 1726853742.80529: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853742.80531: variable 'ansible_pipelining' from source: unknown 32935 1726853742.80533: variable 'ansible_timeout' from source: unknown 32935 1726853742.80536: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853742.80635: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853742.80646: variable 'omit' from source: magic vars 32935 1726853742.80649: starting attempt loop 32935 1726853742.80653: running the handler 32935 1726853742.80668: _low_level_execute_command(): starting 32935 1726853742.80677: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853742.81177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.81181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.81184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.81240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853742.81243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853742.81246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.81301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.82967: stdout chunk (state=3): >>>/root <<< 32935 1726853742.83066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853742.83096: stderr chunk (state=3): >>><<< 32935 1726853742.83099: stdout chunk (state=3): >>><<< 32935 1726853742.83118: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853742.83130: _low_level_execute_command(): starting 32935 1726853742.83142: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050 `" && echo ansible-tmp-1726853742.8311853-34230-37028521878050="` echo /root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050 `" ) && sleep 0' 32935 1726853742.83578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.83588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853742.83590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853742.83593: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.83595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853742.83597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.83639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853742.83646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853742.83648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.83686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.85558: stdout chunk (state=3): >>>ansible-tmp-1726853742.8311853-34230-37028521878050=/root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050 <<< 32935 1726853742.85663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853742.85691: stderr chunk (state=3): >>><<< 32935 1726853742.85695: stdout chunk (state=3): >>><<< 32935 1726853742.85710: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853742.8311853-34230-37028521878050=/root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853742.85733: variable 'ansible_module_compression' from source: unknown 32935 1726853742.85779: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853742.85806: variable 'ansible_facts' from source: unknown 32935 1726853742.85861: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/AnsiballZ_command.py 32935 1726853742.85959: Sending initial data 32935 1726853742.85965: Sent initial data (155 bytes) 32935 1726853742.86409: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.86413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853742.86415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853742.86417: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853742.86419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.86469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853742.86476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.86512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.88067: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 32935 1726853742.88074: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853742.88106: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853742.88153: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpivb72yng /root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/AnsiballZ_command.py <<< 32935 1726853742.88155: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/AnsiballZ_command.py" <<< 32935 1726853742.88186: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpivb72yng" to remote "/root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/AnsiballZ_command.py" <<< 32935 1726853742.88735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853742.88744: stderr chunk (state=3): >>><<< 32935 1726853742.88747: stdout chunk (state=3): >>><<< 32935 1726853742.88765: done transferring module to remote 32935 1726853742.88774: _low_level_execute_command(): starting 32935 1726853742.88780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/ /root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/AnsiballZ_command.py && sleep 0' 32935 1726853742.89206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853742.89210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853742.89215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853742.89218: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.89220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.89265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853742.89268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.89319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853742.91049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853742.91079: stderr chunk (state=3): >>><<< 32935 1726853742.91082: stdout chunk (state=3): >>><<< 32935 1726853742.91096: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853742.91099: _low_level_execute_command(): starting 32935 1726853742.91104: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/AnsiballZ_command.py && sleep 0' 32935 1726853742.91539: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853742.91542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.91545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853742.91547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853742.91549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853742.91596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853742.91599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853742.91650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853743.07754: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-20 13:35:43.067998", "end": "2024-09-20 13:35:43.075514", "delta": "0:00:00.007516", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853743.09338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853743.09370: stderr chunk (state=3): >>><<< 32935 1726853743.09375: stdout chunk (state=3): >>><<< 32935 1726853743.09391: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr101", "type", "veth"], "start": "2024-09-20 13:35:43.067998", "end": "2024-09-20 13:35:43.075514", "delta": "0:00:00.007516", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr101 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
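The record above closes the remote execution of ansible.legacy.command for `ip link del lsr101 type veth` (rc=0). Note that the module payload reports "changed": true, while the task result printed a few records below shows "changed": false; that combination is typically produced when the task itself overrides its changed status. A minimal sketch of what the veth-delete task in manage_test_interface.yml presumably looks like, reconstructed only from the conditional and command visible in this log (the task file itself is not included here):

    # Hypothetical reconstruction; manage_test_interface.yml is not shown in this log.
    - name: Delete veth interface {{ interface }}
      command: ip link del {{ interface }} type veth  # matches the cmd list in the module result above
      when: type == 'veth' and state == 'absent' and interface in current_interfaces
      changed_when: false  # assumption; it would explain "changed": false in the task result below
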
32935 1726853743.09421: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr101 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853743.09428: _low_level_execute_command(): starting 32935 1726853743.09433: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853742.8311853-34230-37028521878050/ > /dev/null 2>&1 && sleep 0' 32935 1726853743.09901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853743.09904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853743.09906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853743.09909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853743.09911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853743.09964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853743.09968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853743.09970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853743.10039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853743.11883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853743.11924: stderr chunk (state=3): >>><<< 32935 1726853743.11927: stdout chunk (state=3): >>><<< 32935 1726853743.11936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853743.11942: handler run complete 32935 1726853743.11959: Evaluated conditional (False): False 32935 1726853743.11969: attempt loop complete, returning result 32935 1726853743.11974: _execute() done 32935 1726853743.11982: dumping result to json 32935 1726853743.11987: done dumping result, returning 32935 1726853743.11994: done running TaskExecutor() for managed_node1/TASK: Delete veth interface lsr101 [02083763-bbaf-84df-441d-000000000942] 32935 1726853743.12000: sending task result for task 02083763-bbaf-84df-441d-000000000942 32935 1726853743.12101: done sending task result for task 02083763-bbaf-84df-441d-000000000942 32935 1726853743.12105: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr101", "type", "veth" ], "delta": "0:00:00.007516", "end": "2024-09-20 13:35:43.075514", "rc": 0, "start": "2024-09-20 13:35:43.067998" } 32935 1726853743.12174: no more pending results, returning what we have 32935 1726853743.12178: results queue empty 32935 1726853743.12179: checking for any_errors_fatal 32935 1726853743.12185: done checking for any_errors_fatal 32935 1726853743.12185: checking for max_fail_percentage 32935 1726853743.12187: done checking for max_fail_percentage 32935 1726853743.12188: checking to see if all hosts have failed and the running result is not ok 32935 1726853743.12189: done checking to see if all hosts have failed 32935 1726853743.12190: getting the remaining hosts for this loop 32935 1726853743.12192: done getting the remaining hosts for this loop 32935 1726853743.12195: getting the next task for host managed_node1 32935 1726853743.12203: done getting next task for host managed_node1 32935 1726853743.12207: ^ task is: TASK: Create dummy interface {{ interface }} 32935 1726853743.12210: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853743.12214: getting variables 32935 1726853743.12216: in VariableManager get_vars() 32935 1726853743.12260: Calling all_inventory to load vars for managed_node1 32935 1726853743.12263: Calling groups_inventory to load vars for managed_node1 32935 1726853743.12265: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853743.12282: Calling all_plugins_play to load vars for managed_node1 32935 1726853743.12285: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853743.12288: Calling groups_plugins_play to load vars for managed_node1 32935 1726853743.13076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853743.14066: done with get_vars() 32935 1726853743.14090: done getting variables 32935 1726853743.14149: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853743.14262: variable 'interface' from source: play vars TASK [Create dummy interface lsr101] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 13:35:43 -0400 (0:00:00.352) 0:00:28.278 ****** 32935 1726853743.14296: entering _queue_task() for managed_node1/command 32935 1726853743.14614: worker is 1 (out of 1 available) 32935 1726853743.14627: exiting _queue_task() for managed_node1/command 32935 1726853743.14641: done queuing things up, now waiting for results queue to drain 32935 1726853743.14642: waiting for pending results... 
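The scheduler now walks the dummy-interface branches of the same included file; as the records that follow show, both are skipped because this run manages a veth (type == 'veth', state == 'absent'). A hedged sketch of those branches, reconstructed only from the task names and false_condition strings printed below; the commands never run in this log and are assumed placeholders:

    # Hypothetical sketch; names and conditionals are taken from the log, commands are placeholders.
    - name: Create dummy interface {{ interface }}
      command: ip link add {{ interface }} type dummy
      when: type == 'dummy' and state == 'present' and interface not in current_interfaces

    - name: Delete dummy interface {{ interface }}
      command: ip link del {{ interface }} type dummy
      when: type == 'dummy' and state == 'absent' and interface in current_interfaces
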
32935 1726853743.15087: running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr101 32935 1726853743.15092: in run() - task 02083763-bbaf-84df-441d-000000000943 32935 1726853743.15095: variable 'ansible_search_path' from source: unknown 32935 1726853743.15098: variable 'ansible_search_path' from source: unknown 32935 1726853743.15101: calling self._execute() 32935 1726853743.15191: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.15203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.15222: variable 'omit' from source: magic vars 32935 1726853743.15597: variable 'ansible_distribution_major_version' from source: facts 32935 1726853743.15614: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853743.15819: variable 'type' from source: play vars 32935 1726853743.15829: variable 'state' from source: include params 32935 1726853743.15838: variable 'interface' from source: play vars 32935 1726853743.15847: variable 'current_interfaces' from source: set_fact 32935 1726853743.15859: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 32935 1726853743.15874: when evaluation is False, skipping this task 32935 1726853743.15882: _execute() done 32935 1726853743.15889: dumping result to json 32935 1726853743.15896: done dumping result, returning 32935 1726853743.15905: done running TaskExecutor() for managed_node1/TASK: Create dummy interface lsr101 [02083763-bbaf-84df-441d-000000000943] 32935 1726853743.15977: sending task result for task 02083763-bbaf-84df-441d-000000000943 32935 1726853743.16039: done sending task result for task 02083763-bbaf-84df-441d-000000000943 32935 1726853743.16043: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32935 1726853743.16127: no more pending results, returning what we have 32935 1726853743.16131: results queue empty 32935 1726853743.16133: checking for any_errors_fatal 32935 1726853743.16145: done checking for any_errors_fatal 32935 1726853743.16146: checking for max_fail_percentage 32935 1726853743.16148: done checking for max_fail_percentage 32935 1726853743.16148: checking to see if all hosts have failed and the running result is not ok 32935 1726853743.16150: done checking to see if all hosts have failed 32935 1726853743.16150: getting the remaining hosts for this loop 32935 1726853743.16152: done getting the remaining hosts for this loop 32935 1726853743.16156: getting the next task for host managed_node1 32935 1726853743.16165: done getting next task for host managed_node1 32935 1726853743.16167: ^ task is: TASK: Delete dummy interface {{ interface }} 32935 1726853743.16172: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853743.16177: getting variables 32935 1726853743.16179: in VariableManager get_vars() 32935 1726853743.16223: Calling all_inventory to load vars for managed_node1 32935 1726853743.16226: Calling groups_inventory to load vars for managed_node1 32935 1726853743.16229: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853743.16243: Calling all_plugins_play to load vars for managed_node1 32935 1726853743.16246: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853743.16249: Calling groups_plugins_play to load vars for managed_node1 32935 1726853743.17921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853743.19457: done with get_vars() 32935 1726853743.19481: done getting variables 32935 1726853743.19542: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853743.19652: variable 'interface' from source: play vars TASK [Delete dummy interface lsr101] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 13:35:43 -0400 (0:00:00.053) 0:00:28.332 ****** 32935 1726853743.19685: entering _queue_task() for managed_node1/command 32935 1726853743.20022: worker is 1 (out of 1 available) 32935 1726853743.20036: exiting _queue_task() for managed_node1/command 32935 1726853743.20047: done queuing things up, now waiting for results queue to drain 32935 1726853743.20049: waiting for pending results... 
32935 1726853743.20490: running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr101 32935 1726853743.20496: in run() - task 02083763-bbaf-84df-441d-000000000944 32935 1726853743.20499: variable 'ansible_search_path' from source: unknown 32935 1726853743.20501: variable 'ansible_search_path' from source: unknown 32935 1726853743.20504: calling self._execute() 32935 1726853743.20600: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.20610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.20676: variable 'omit' from source: magic vars 32935 1726853743.20999: variable 'ansible_distribution_major_version' from source: facts 32935 1726853743.21016: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853743.21214: variable 'type' from source: play vars 32935 1726853743.21272: variable 'state' from source: include params 32935 1726853743.21275: variable 'interface' from source: play vars 32935 1726853743.21278: variable 'current_interfaces' from source: set_fact 32935 1726853743.21280: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 32935 1726853743.21283: when evaluation is False, skipping this task 32935 1726853743.21284: _execute() done 32935 1726853743.21287: dumping result to json 32935 1726853743.21289: done dumping result, returning 32935 1726853743.21291: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface lsr101 [02083763-bbaf-84df-441d-000000000944] 32935 1726853743.21293: sending task result for task 02083763-bbaf-84df-441d-000000000944 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32935 1726853743.21538: no more pending results, returning what we have 32935 1726853743.21541: results queue empty 32935 1726853743.21543: checking for any_errors_fatal 32935 1726853743.21549: done checking for any_errors_fatal 32935 1726853743.21549: checking for max_fail_percentage 32935 1726853743.21552: done checking for max_fail_percentage 32935 1726853743.21552: checking to see if all hosts have failed and the running result is not ok 32935 1726853743.21553: done checking to see if all hosts have failed 32935 1726853743.21554: getting the remaining hosts for this loop 32935 1726853743.21555: done getting the remaining hosts for this loop 32935 1726853743.21559: getting the next task for host managed_node1 32935 1726853743.21567: done getting next task for host managed_node1 32935 1726853743.21570: ^ task is: TASK: Create tap interface {{ interface }} 32935 1726853743.21576: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853743.21582: getting variables 32935 1726853743.21585: in VariableManager get_vars() 32935 1726853743.21634: Calling all_inventory to load vars for managed_node1 32935 1726853743.21637: Calling groups_inventory to load vars for managed_node1 32935 1726853743.21639: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853743.21654: Calling all_plugins_play to load vars for managed_node1 32935 1726853743.21656: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853743.21659: Calling groups_plugins_play to load vars for managed_node1 32935 1726853743.22184: done sending task result for task 02083763-bbaf-84df-441d-000000000944 32935 1726853743.22188: WORKER PROCESS EXITING 32935 1726853743.22967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853743.24548: done with get_vars() 32935 1726853743.24574: done getting variables 32935 1726853743.24632: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853743.24742: variable 'interface' from source: play vars TASK [Create tap interface lsr101] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 13:35:43 -0400 (0:00:00.050) 0:00:28.383 ****** 32935 1726853743.24774: entering _queue_task() for managed_node1/command 32935 1726853743.25120: worker is 1 (out of 1 available) 32935 1726853743.25133: exiting _queue_task() for managed_node1/command 32935 1726853743.25146: done queuing things up, now waiting for results queue to drain 32935 1726853743.25148: waiting for pending results... 
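The tap branches are queued and skipped the same way in the records below. Sketched under the same assumptions (names and conditionals from the log; the `ip tuntap` commands are assumptions, since neither task executes in this run):

    # Hypothetical sketch of the tap branches; commands are plausible placeholders only.
    - name: Create tap interface {{ interface }}
      command: ip tuntap add dev {{ interface }} mode tap
      when: type == 'tap' and state == 'present' and interface not in current_interfaces

    - name: Delete tap interface {{ interface }}
      command: ip tuntap del dev {{ interface }} mode tap
      when: type == 'tap' and state == 'absent' and interface in current_interfaces
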
32935 1726853743.25428: running TaskExecutor() for managed_node1/TASK: Create tap interface lsr101 32935 1726853743.25544: in run() - task 02083763-bbaf-84df-441d-000000000945 32935 1726853743.25564: variable 'ansible_search_path' from source: unknown 32935 1726853743.25574: variable 'ansible_search_path' from source: unknown 32935 1726853743.25625: calling self._execute() 32935 1726853743.25739: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.25752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.25769: variable 'omit' from source: magic vars 32935 1726853743.26182: variable 'ansible_distribution_major_version' from source: facts 32935 1726853743.26202: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853743.26420: variable 'type' from source: play vars 32935 1726853743.26432: variable 'state' from source: include params 32935 1726853743.26441: variable 'interface' from source: play vars 32935 1726853743.26450: variable 'current_interfaces' from source: set_fact 32935 1726853743.26469: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 32935 1726853743.26478: when evaluation is False, skipping this task 32935 1726853743.26486: _execute() done 32935 1726853743.26494: dumping result to json 32935 1726853743.26501: done dumping result, returning 32935 1726853743.26512: done running TaskExecutor() for managed_node1/TASK: Create tap interface lsr101 [02083763-bbaf-84df-441d-000000000945] 32935 1726853743.26572: sending task result for task 02083763-bbaf-84df-441d-000000000945 32935 1726853743.26642: done sending task result for task 02083763-bbaf-84df-441d-000000000945 32935 1726853743.26646: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 32935 1726853743.26699: no more pending results, returning what we have 32935 1726853743.26703: results queue empty 32935 1726853743.26704: checking for any_errors_fatal 32935 1726853743.26711: done checking for any_errors_fatal 32935 1726853743.26713: checking for max_fail_percentage 32935 1726853743.26715: done checking for max_fail_percentage 32935 1726853743.26716: checking to see if all hosts have failed and the running result is not ok 32935 1726853743.26717: done checking to see if all hosts have failed 32935 1726853743.26718: getting the remaining hosts for this loop 32935 1726853743.26720: done getting the remaining hosts for this loop 32935 1726853743.26724: getting the next task for host managed_node1 32935 1726853743.26733: done getting next task for host managed_node1 32935 1726853743.26736: ^ task is: TASK: Delete tap interface {{ interface }} 32935 1726853743.26740: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853743.26743: getting variables 32935 1726853743.26746: in VariableManager get_vars() 32935 1726853743.26794: Calling all_inventory to load vars for managed_node1 32935 1726853743.26797: Calling groups_inventory to load vars for managed_node1 32935 1726853743.26801: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853743.26815: Calling all_plugins_play to load vars for managed_node1 32935 1726853743.26819: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853743.26822: Calling groups_plugins_play to load vars for managed_node1 32935 1726853743.28521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853743.30065: done with get_vars() 32935 1726853743.30092: done getting variables 32935 1726853743.30150: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 32935 1726853743.30261: variable 'interface' from source: play vars TASK [Delete tap interface lsr101] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 13:35:43 -0400 (0:00:00.055) 0:00:28.438 ****** 32935 1726853743.30293: entering _queue_task() for managed_node1/command 32935 1726853743.30620: worker is 1 (out of 1 available) 32935 1726853743.30633: exiting _queue_task() for managed_node1/command 32935 1726853743.30645: done queuing things up, now waiting for results queue to drain 32935 1726853743.30646: waiting for pending results... 
32935 1726853743.30999: running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr101 32935 1726853743.31077: in run() - task 02083763-bbaf-84df-441d-000000000946 32935 1726853743.31081: variable 'ansible_search_path' from source: unknown 32935 1726853743.31083: variable 'ansible_search_path' from source: unknown 32935 1726853743.31109: calling self._execute() 32935 1726853743.31211: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.31222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.31276: variable 'omit' from source: magic vars 32935 1726853743.31610: variable 'ansible_distribution_major_version' from source: facts 32935 1726853743.31627: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853743.31822: variable 'type' from source: play vars 32935 1726853743.31832: variable 'state' from source: include params 32935 1726853743.31840: variable 'interface' from source: play vars 32935 1726853743.32078: variable 'current_interfaces' from source: set_fact 32935 1726853743.32083: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 32935 1726853743.32085: when evaluation is False, skipping this task 32935 1726853743.32087: _execute() done 32935 1726853743.32089: dumping result to json 32935 1726853743.32091: done dumping result, returning 32935 1726853743.32094: done running TaskExecutor() for managed_node1/TASK: Delete tap interface lsr101 [02083763-bbaf-84df-441d-000000000946] 32935 1726853743.32096: sending task result for task 02083763-bbaf-84df-441d-000000000946 32935 1726853743.32162: done sending task result for task 02083763-bbaf-84df-441d-000000000946 32935 1726853743.32166: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 32935 1726853743.32216: no more pending results, returning what we have 32935 1726853743.32220: results queue empty 32935 1726853743.32221: checking for any_errors_fatal 32935 1726853743.32227: done checking for any_errors_fatal 32935 1726853743.32228: checking for max_fail_percentage 32935 1726853743.32229: done checking for max_fail_percentage 32935 1726853743.32230: checking to see if all hosts have failed and the running result is not ok 32935 1726853743.32231: done checking to see if all hosts have failed 32935 1726853743.32232: getting the remaining hosts for this loop 32935 1726853743.32234: done getting the remaining hosts for this loop 32935 1726853743.32240: getting the next task for host managed_node1 32935 1726853743.32252: done getting next task for host managed_node1 32935 1726853743.32255: ^ task is: TASK: Verify network state restored to default 32935 1726853743.32258: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853743.32262: getting variables 32935 1726853743.32264: in VariableManager get_vars() 32935 1726853743.32312: Calling all_inventory to load vars for managed_node1 32935 1726853743.32315: Calling groups_inventory to load vars for managed_node1 32935 1726853743.32318: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853743.32332: Calling all_plugins_play to load vars for managed_node1 32935 1726853743.32335: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853743.32338: Calling groups_plugins_play to load vars for managed_node1 32935 1726853743.34023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853743.35680: done with get_vars() 32935 1726853743.35702: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:77 Friday 20 September 2024 13:35:43 -0400 (0:00:00.056) 0:00:28.494 ****** 32935 1726853743.35899: entering _queue_task() for managed_node1/include_tasks 32935 1726853743.36440: worker is 1 (out of 1 available) 32935 1726853743.36454: exiting _queue_task() for managed_node1/include_tasks 32935 1726853743.36467: done queuing things up, now waiting for results queue to drain 32935 1726853743.36469: waiting for pending results... 32935 1726853743.36957: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 32935 1726853743.37280: in run() - task 02083763-bbaf-84df-441d-0000000000ab 32935 1726853743.37305: variable 'ansible_search_path' from source: unknown 32935 1726853743.37349: calling self._execute() 32935 1726853743.37583: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.37601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.37620: variable 'omit' from source: magic vars 32935 1726853743.38508: variable 'ansible_distribution_major_version' from source: facts 32935 1726853743.38676: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853743.38680: _execute() done 32935 1726853743.38683: dumping result to json 32935 1726853743.38686: done dumping result, returning 32935 1726853743.38690: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [02083763-bbaf-84df-441d-0000000000ab] 32935 1726853743.38693: sending task result for task 02083763-bbaf-84df-441d-0000000000ab 32935 1726853743.38775: done sending task result for task 02083763-bbaf-84df-441d-0000000000ab 32935 1726853743.38779: WORKER PROCESS EXITING 32935 1726853743.38810: no more pending results, returning what we have 32935 1726853743.38815: in VariableManager get_vars() 32935 1726853743.38867: Calling all_inventory to load vars for managed_node1 32935 1726853743.38870: Calling groups_inventory to load vars for managed_node1 32935 1726853743.38876: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853743.38891: Calling all_plugins_play to load vars for managed_node1 32935 1726853743.38894: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853743.38897: Calling groups_plugins_play to load vars for managed_node1 32935 1726853743.40433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853743.41932: done with get_vars() 32935 1726853743.41958: 
variable 'ansible_search_path' from source: unknown 32935 1726853743.41978: we have included files to process 32935 1726853743.41979: generating all_blocks data 32935 1726853743.41982: done generating all_blocks data 32935 1726853743.41987: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 32935 1726853743.41988: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 32935 1726853743.41991: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 32935 1726853743.42401: done processing included file 32935 1726853743.42404: iterating over new_blocks loaded from include file 32935 1726853743.42405: in VariableManager get_vars() 32935 1726853743.42426: done with get_vars() 32935 1726853743.42428: filtering new block on tags 32935 1726853743.42446: done filtering new block on tags 32935 1726853743.42449: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 32935 1726853743.42454: extending task lists for all hosts with included blocks 32935 1726853743.45486: done extending task lists 32935 1726853743.45488: done processing included files 32935 1726853743.45489: results queue empty 32935 1726853743.45490: checking for any_errors_fatal 32935 1726853743.45493: done checking for any_errors_fatal 32935 1726853743.45494: checking for max_fail_percentage 32935 1726853743.45495: done checking for max_fail_percentage 32935 1726853743.45496: checking to see if all hosts have failed and the running result is not ok 32935 1726853743.45497: done checking to see if all hosts have failed 32935 1726853743.45497: getting the remaining hosts for this loop 32935 1726853743.45499: done getting the remaining hosts for this loop 32935 1726853743.45501: getting the next task for host managed_node1 32935 1726853743.45505: done getting next task for host managed_node1 32935 1726853743.45507: ^ task is: TASK: Check routes and DNS 32935 1726853743.45510: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853743.45512: getting variables 32935 1726853743.45513: in VariableManager get_vars() 32935 1726853743.45532: Calling all_inventory to load vars for managed_node1 32935 1726853743.45535: Calling groups_inventory to load vars for managed_node1 32935 1726853743.45537: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853743.45543: Calling all_plugins_play to load vars for managed_node1 32935 1726853743.45546: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853743.45548: Calling groups_plugins_play to load vars for managed_node1 32935 1726853743.46765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853743.48294: done with get_vars() 32935 1726853743.48324: done getting variables 32935 1726853743.48377: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:35:43 -0400 (0:00:00.125) 0:00:28.619 ****** 32935 1726853743.48410: entering _queue_task() for managed_node1/shell 32935 1726853743.48794: worker is 1 (out of 1 available) 32935 1726853743.48807: exiting _queue_task() for managed_node1/shell 32935 1726853743.48819: done queuing things up, now waiting for results queue to drain 32935 1726853743.48821: waiting for pending results... 32935 1726853743.49293: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 32935 1726853743.49298: in run() - task 02083763-bbaf-84df-441d-000000000b17 32935 1726853743.49301: variable 'ansible_search_path' from source: unknown 32935 1726853743.49303: variable 'ansible_search_path' from source: unknown 32935 1726853743.49305: calling self._execute() 32935 1726853743.49412: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.49424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.49498: variable 'omit' from source: magic vars 32935 1726853743.49829: variable 'ansible_distribution_major_version' from source: facts 32935 1726853743.49846: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853743.49855: variable 'omit' from source: magic vars 32935 1726853743.49894: variable 'omit' from source: magic vars 32935 1726853743.49934: variable 'omit' from source: magic vars 32935 1726853743.49981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853743.50022: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853743.50051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853743.50073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853743.50089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853743.50147: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
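The "Verify network state restored to default" task at tests_vlan_mtu.yml:77 is an include_tasks that pulls in check_network_dns.yml, and the first task of that file, "Check routes and DNS" (check_network_dns.yml:6), runs through the shell action plugin loaded in the record above. A hedged sketch of that structure, built only from the paths, task names and plugins named in this log; the shell body is not visible in this section and is left as a placeholder:

    # tests_vlan_mtu.yml:77 -- hypothetical reconstruction from the task path and name in the log.
    - name: Verify network state restored to default
      include_tasks: tasks/check_network_dns.yml

    # tasks/check_network_dns.yml:6 -- the actual shell command is not shown in this section.
    - name: Check routes and DNS
      shell: 'true'  # placeholder; the real command is truncated out of this log
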
32935 1726853743.50150: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.50153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.50240: Set connection var ansible_timeout to 10 32935 1726853743.50258: Set connection var ansible_shell_type to sh 32935 1726853743.50365: Set connection var ansible_pipelining to False 32935 1726853743.50368: Set connection var ansible_connection to ssh 32935 1726853743.50370: Set connection var ansible_shell_executable to /bin/sh 32935 1726853743.50374: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853743.50376: variable 'ansible_shell_executable' from source: unknown 32935 1726853743.50378: variable 'ansible_connection' from source: unknown 32935 1726853743.50380: variable 'ansible_module_compression' from source: unknown 32935 1726853743.50382: variable 'ansible_shell_type' from source: unknown 32935 1726853743.50385: variable 'ansible_shell_executable' from source: unknown 32935 1726853743.50387: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.50389: variable 'ansible_pipelining' from source: unknown 32935 1726853743.50391: variable 'ansible_timeout' from source: unknown 32935 1726853743.50394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.50498: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853743.50515: variable 'omit' from source: magic vars 32935 1726853743.50526: starting attempt loop 32935 1726853743.50533: running the handler 32935 1726853743.50549: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853743.50576: _low_level_execute_command(): starting 32935 1726853743.50595: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853743.51361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853743.51469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853743.51489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 32935 1726853743.51577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853743.53300: stdout chunk (state=3): >>>/root <<< 32935 1726853743.53392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853743.53599: stderr chunk (state=3): >>><<< 32935 1726853743.53602: stdout chunk (state=3): >>><<< 32935 1726853743.53606: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853743.53608: _low_level_execute_command(): starting 32935 1726853743.53611: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761 `" && echo ansible-tmp-1726853743.5352721-34255-149565168720761="` echo /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761 `" ) && sleep 0' 32935 1726853743.54801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853743.54804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853743.54815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853743.54818: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853743.54821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853743.54865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853743.54880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
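For readability, the remote temp-directory command quoted in the record above can be laid out as an annotated sketch; the line breaks and comments below are editorial, and the command actually executed is the single-line string shown in the log:

    (
      # directories created below get mode 0700 (umask 077)
      umask 77 &&
      # make sure the base remote_tmp exists
      mkdir -p "` echo /root/.ansible/tmp `" &&
      # create the per-task directory; plain mkdir fails if it already exists
      mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761 `" &&
      # print the directory back as key=value so the controller can parse its location from stdout
      echo ansible-tmp-1726853743.5352721-34255-149565168720761="` echo /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761 `"
    ) && sleep 0
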
32935 1726853743.54937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853743.57381: stdout chunk (state=3): >>>ansible-tmp-1726853743.5352721-34255-149565168720761=/root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761 <<< 32935 1726853743.57385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853743.57387: stdout chunk (state=3): >>><<< 32935 1726853743.57390: stderr chunk (state=3): >>><<< 32935 1726853743.57392: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853743.5352721-34255-149565168720761=/root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853743.57395: variable 'ansible_module_compression' from source: unknown 32935 1726853743.57397: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853743.57399: variable 'ansible_facts' from source: unknown 32935 1726853743.57747: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/AnsiballZ_command.py 32935 1726853743.58241: Sending initial data 32935 1726853743.58368: Sent initial data (156 bytes) 32935 1726853743.59459: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853743.59657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting 
O_NONBLOCK <<< 32935 1726853743.59790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853743.59858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853743.61539: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853743.61543: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/AnsiballZ_command.py" <<< 32935 1726853743.61546: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpvxsjhpiz /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/AnsiballZ_command.py <<< 32935 1726853743.61575: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpvxsjhpiz" to remote "/root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/AnsiballZ_command.py" <<< 32935 1726853743.63140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853743.63144: stderr chunk (state=3): >>><<< 32935 1726853743.63147: stdout chunk (state=3): >>><<< 32935 1726853743.63255: done transferring module to remote 32935 1726853743.63262: _low_level_execute_command(): starting 32935 1726853743.63265: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/ /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/AnsiballZ_command.py && sleep 0' 32935 1726853743.64588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853743.64663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853743.64682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853743.64722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 32935 1726853743.64731: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853743.64831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853743.64896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853743.64946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853743.65010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853743.65202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853743.67079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853743.67083: stdout chunk (state=3): >>><<< 32935 1726853743.67085: stderr chunk (state=3): >>><<< 32935 1726853743.67104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853743.67264: _low_level_execute_command(): starting 32935 1726853743.67268: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/AnsiballZ_command.py && sleep 0' 32935 1726853743.68331: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853743.68335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853743.68337: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853743.68339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853743.68341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853743.68609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853743.68648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853743.84709: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2705sec preferred_lft 2705sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:35:43.837246", "end": "2024-09-20 13:35:43.845888", "delta": "0:00:00.008642", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853743.86608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853743.86612: stdout chunk (state=3): >>><<< 32935 1726853743.86614: stderr chunk (state=3): >>><<< 32935 1726853743.86617: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 2705sec preferred_lft 2705sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:35:43.837246", "end": "2024-09-20 13:35:43.845888", "delta": "0:00:00.008642", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
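The "Check routes and DNS" shell script that this module run executed is embedded above as a JSON-escaped _raw_params string; expanded into plain shell it reads as follows (the trailing comments are added here for orientation and are not part of the original task):

    set -euo pipefail
    echo IP
    ip a              # interface and address listing
    echo IP ROUTE
    ip route          # IPv4 routing table
    echo IP -6 ROUTE
    ip -6 route       # IPv6 routing table
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
     cat /etc/resolv.conf              # resolver configuration (generated by NetworkManager on this host)
    else
     echo NO /etc/resolv.conf
     ls -alrtF /etc/resolv.* || :      # "|| :" keeps set -e from aborting if the glob matches nothing
    fi
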
32935 1726853743.86625: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853743.86627: _low_level_execute_command(): starting 32935 1726853743.86629: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853743.5352721-34255-149565168720761/ > /dev/null 2>&1 && sleep 0' 32935 1726853743.87743: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853743.87747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853743.87750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853743.87752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 32935 1726853743.87754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853743.88112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853743.88165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853743.89950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853743.89984: stderr chunk (state=3): >>><<< 32935 1726853743.90060: stdout chunk (state=3): >>><<< 32935 1726853743.90080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853743.90094: handler run complete 32935 1726853743.90257: Evaluated conditional (False): False 32935 1726853743.90263: attempt loop complete, returning result 32935 1726853743.90265: _execute() done 32935 1726853743.90267: dumping result to json 32935 1726853743.90270: done dumping result, returning 32935 1726853743.90275: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [02083763-bbaf-84df-441d-000000000b17] 32935 1726853743.90382: sending task result for task 02083763-bbaf-84df-441d-000000000b17 32935 1726853743.90460: done sending task result for task 02083763-bbaf-84df-441d-000000000b17 32935 1726853743.90463: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008642", "end": "2024-09-20 13:35:43.845888", "rc": 0, "start": "2024-09-20 13:35:43.837246" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 2705sec preferred_lft 2705sec inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 32935 1726853743.90540: no more pending results, returning what we have 32935 1726853743.90544: results queue empty 32935 1726853743.90546: checking for any_errors_fatal 32935 1726853743.90547: done checking for any_errors_fatal 32935 1726853743.90548: checking for max_fail_percentage 32935 1726853743.90550: done checking for max_fail_percentage 32935 1726853743.90551: checking to see if all hosts have failed and the running result is not ok 32935 1726853743.90552: done checking to see if all hosts have failed 32935 1726853743.90553: getting the remaining hosts for this loop 32935 1726853743.90555: done getting the remaining hosts for this loop 32935 1726853743.90558: getting the next task for host managed_node1 32935 1726853743.90567: done getting next task for host 
managed_node1 32935 1726853743.90569: ^ task is: TASK: Verify DNS and network connectivity 32935 1726853743.90776: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853743.90783: getting variables 32935 1726853743.90785: in VariableManager get_vars() 32935 1726853743.90828: Calling all_inventory to load vars for managed_node1 32935 1726853743.90835: Calling groups_inventory to load vars for managed_node1 32935 1726853743.90839: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853743.90851: Calling all_plugins_play to load vars for managed_node1 32935 1726853743.90854: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853743.90858: Calling groups_plugins_play to load vars for managed_node1 32935 1726853743.93840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853743.95654: done with get_vars() 32935 1726853743.95683: done getting variables 32935 1726853743.95767: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:35:43 -0400 (0:00:00.473) 0:00:29.093 ****** 32935 1726853743.95827: entering _queue_task() for managed_node1/shell 32935 1726853743.96486: worker is 1 (out of 1 available) 32935 1726853743.96496: exiting _queue_task() for managed_node1/shell 32935 1726853743.96507: done queuing things up, now waiting for results queue to drain 32935 1726853743.96509: waiting for pending results... 
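The script behind this "Verify DNS and network connectivity" task appears JSON-escaped in the module result further down; rendered as plain shell (indentation and comments here are editorial) it is:

    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
      # name resolution check via the system resolver (getent uses NSS)
      if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
      fi
      # basic HTTPS reachability check; the response body is discarded
      if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
      fi
    done
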
32935 1726853743.97147: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 32935 1726853743.97162: in run() - task 02083763-bbaf-84df-441d-000000000b18 32935 1726853743.97165: variable 'ansible_search_path' from source: unknown 32935 1726853743.97167: variable 'ansible_search_path' from source: unknown 32935 1726853743.97260: calling self._execute() 32935 1726853743.97450: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.97495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.97594: variable 'omit' from source: magic vars 32935 1726853743.98193: variable 'ansible_distribution_major_version' from source: facts 32935 1726853743.98212: Evaluated conditional (ansible_distribution_major_version != '6'): True 32935 1726853743.98387: variable 'ansible_facts' from source: unknown 32935 1726853743.99423: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 32935 1726853743.99519: variable 'omit' from source: magic vars 32935 1726853743.99523: variable 'omit' from source: magic vars 32935 1726853743.99526: variable 'omit' from source: magic vars 32935 1726853743.99577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 32935 1726853743.99625: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 32935 1726853743.99658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 32935 1726853743.99684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853743.99702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 32935 1726853743.99741: variable 'inventory_hostname' from source: host vars for 'managed_node1' 32935 1726853743.99751: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853743.99763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853743.99895: Set connection var ansible_timeout to 10 32935 1726853743.99941: Set connection var ansible_shell_type to sh 32935 1726853743.99958: Set connection var ansible_pipelining to False 32935 1726853743.99986: Set connection var ansible_connection to ssh 32935 1726853743.99989: Set connection var ansible_shell_executable to /bin/sh 32935 1726853744.00089: Set connection var ansible_module_compression to ZIP_DEFLATED 32935 1726853744.00122: variable 'ansible_shell_executable' from source: unknown 32935 1726853744.00130: variable 'ansible_connection' from source: unknown 32935 1726853744.00169: variable 'ansible_module_compression' from source: unknown 32935 1726853744.00174: variable 'ansible_shell_type' from source: unknown 32935 1726853744.00176: variable 'ansible_shell_executable' from source: unknown 32935 1726853744.00178: variable 'ansible_host' from source: host vars for 'managed_node1' 32935 1726853744.00180: variable 'ansible_pipelining' from source: unknown 32935 1726853744.00182: variable 'ansible_timeout' from source: unknown 32935 1726853744.00184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 32935 1726853744.00330: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853744.00375: variable 'omit' from source: magic vars 32935 1726853744.00378: starting attempt loop 32935 1726853744.00385: running the handler 32935 1726853744.00388: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 32935 1726853744.00399: _low_level_execute_command(): starting 32935 1726853744.00426: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 32935 1726853744.01379: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853744.01383: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853744.01385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853744.01486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853744.01518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853744.01542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853744.01558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853744.01639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853744.03340: stdout chunk (state=3): >>>/root <<< 32935 1726853744.03481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853744.03509: stdout chunk (state=3): >>><<< 32935 1726853744.03512: stderr chunk (state=3): >>><<< 32935 1726853744.03530: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853744.03549: _low_level_execute_command(): starting 32935 1726853744.03559: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731 `" && echo ansible-tmp-1726853744.0353734-34272-232603784840731="` echo /root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731 `" ) && sleep 0' 32935 1726853744.04180: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853744.04197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853744.04216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853744.04243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853744.04263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853744.04284: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853744.04306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853744.04332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 32935 1726853744.04385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853744.04436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853744.04456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853744.04479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853744.04676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853744.06575: stdout chunk (state=3): >>>ansible-tmp-1726853744.0353734-34272-232603784840731=/root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731 <<< 32935 1726853744.06712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853744.06723: stdout chunk (state=3): >>><<< 32935 1726853744.06737: stderr chunk (state=3): >>><<< 32935 1726853744.06763: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853744.0353734-34272-232603784840731=/root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853744.06814: variable 'ansible_module_compression' from source: unknown 32935 1726853744.06875: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-32935vj31k4ae/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 32935 1726853744.07004: variable 'ansible_facts' from source: unknown 32935 1726853744.07007: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/AnsiballZ_command.py 32935 1726853744.07288: Sending initial data 32935 1726853744.07485: Sent initial data (156 bytes) 32935 1726853744.07915: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853744.07918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853744.07920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853744.07922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853744.07924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853744.07980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853744.07991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853744.08011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853744.08080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853744.09606: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 32935 1726853744.09667: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 32935 1726853744.09729: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpn_z5c0zm /root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/AnsiballZ_command.py <<< 32935 1726853744.09743: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/AnsiballZ_command.py" <<< 32935 1726853744.09781: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-32935vj31k4ae/tmpn_z5c0zm" to remote "/root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/AnsiballZ_command.py" <<< 32935 1726853744.10474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853744.10553: stderr chunk (state=3): >>><<< 32935 1726853744.10566: stdout chunk (state=3): >>><<< 32935 1726853744.10630: done transferring module to remote 32935 1726853744.10655: _low_level_execute_command(): starting 32935 1726853744.10669: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/ /root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/AnsiballZ_command.py && sleep 0' 32935 1726853744.11292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 32935 1726853744.11316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853744.11319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853744.11375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853744.11378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853744.11381: stderr chunk (state=3): >>>debug2: match not found <<< 32935 1726853744.11383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853744.11385: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 32935 1726853744.11387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 32935 1726853744.11389: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 32935 1726853744.11399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 32935 1726853744.11407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853744.11423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 32935 1726853744.11433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 32935 1726853744.11441: stderr chunk (state=3): >>>debug2: match found <<< 32935 1726853744.11452: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853744.11517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853744.11533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853744.11554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853744.11641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853744.13577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853744.13583: stdout chunk (state=3): >>><<< 32935 1726853744.13586: stderr chunk (state=3): >>><<< 32935 1726853744.13589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853744.13592: _low_level_execute_command(): starting 32935 1726853744.13595: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/AnsiballZ_command.py && sleep 0' 32935 1726853744.14189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853744.14241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853744.14252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853744.14272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853744.14355: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853744.57465: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1387 0 --:--:-- --:--:-- --:--:-- 1392\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 8139 0 --:--:-- --:--:-- --:--:-- 8314", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:35:44.294805", "end": "2024-09-20 13:35:44.573149", "delta": "0:00:00.278344", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 32935 1726853744.59088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 32935 1726853744.59118: stderr chunk (state=3): >>><<< 32935 1726853744.59121: stdout chunk (state=3): >>><<< 32935 1726853744.59141: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1387 0 --:--:-- --:--:-- --:--:-- 1392\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 8139 0 --:--:-- --:--:-- --:--:-- 8314", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:35:44.294805", "end": "2024-09-20 13:35:44.573149", "delta": "0:00:00.278344", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 32935 1726853744.59178: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 32935 1726853744.59186: _low_level_execute_command(): starting 32935 1726853744.59191: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853744.0353734-34272-232603784840731/ > /dev/null 2>&1 && sleep 0' 32935 1726853744.59639: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853744.59642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 32935 1726853744.59644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 32935 1726853744.59646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 32935 1726853744.59648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 32935 1726853744.59704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 32935 1726853744.59707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 32935 1726853744.59709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 32935 1726853744.59751: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 32935 1726853744.61585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 32935 1726853744.61614: stderr chunk (state=3): >>><<< 32935 1726853744.61617: stdout chunk (state=3): >>><<< 32935 1726853744.61630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 32935 1726853744.61637: handler run complete 32935 1726853744.61654: Evaluated conditional (False): False 32935 1726853744.61663: attempt loop complete, returning result 32935 1726853744.61666: _execute() done 32935 1726853744.61669: dumping result to json 32935 1726853744.61676: done dumping result, returning 32935 1726853744.61683: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [02083763-bbaf-84df-441d-000000000b18] 32935 1726853744.61687: sending task result for task 02083763-bbaf-84df-441d-000000000b18 32935 1726853744.61791: done sending task result for task 02083763-bbaf-84df-441d-000000000b18 32935 1726853744.61794: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.278344",
    "end": "2024-09-20 13:35:44.573149",
    "rc": 0,
    "start": "2024-09-20 13:35:44.294805"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 1387 0 --:--:-- --:--:-- --:--:-- 1392
 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 8139 0 --:--:-- --:--:-- --:--:-- 8314
32935 1726853744.61879: no more pending results, returning what we have 32935 1726853744.61882: results queue empty 32935 1726853744.61884: checking for
any_errors_fatal 32935 1726853744.61895: done checking for any_errors_fatal 32935 1726853744.61895: checking for max_fail_percentage 32935 1726853744.61897: done checking for max_fail_percentage 32935 1726853744.61898: checking to see if all hosts have failed and the running result is not ok 32935 1726853744.61899: done checking to see if all hosts have failed 32935 1726853744.61900: getting the remaining hosts for this loop 32935 1726853744.61902: done getting the remaining hosts for this loop 32935 1726853744.61905: getting the next task for host managed_node1 32935 1726853744.61913: done getting next task for host managed_node1 32935 1726853744.61917: ^ task is: TASK: meta (flush_handlers) 32935 1726853744.61919: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853744.61923: getting variables 32935 1726853744.61924: in VariableManager get_vars() 32935 1726853744.61964: Calling all_inventory to load vars for managed_node1 32935 1726853744.61967: Calling groups_inventory to load vars for managed_node1 32935 1726853744.61969: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853744.61987: Calling all_plugins_play to load vars for managed_node1 32935 1726853744.61989: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853744.61992: Calling groups_plugins_play to load vars for managed_node1 32935 1726853744.62869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853744.63720: done with get_vars() 32935 1726853744.63736: done getting variables 32935 1726853744.63789: in VariableManager get_vars() 32935 1726853744.63799: Calling all_inventory to load vars for managed_node1 32935 1726853744.63800: Calling groups_inventory to load vars for managed_node1 32935 1726853744.63802: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853744.63805: Calling all_plugins_play to load vars for managed_node1 32935 1726853744.63806: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853744.63808: Calling groups_plugins_play to load vars for managed_node1 32935 1726853744.64431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853744.65346: done with get_vars() 32935 1726853744.65364: done queuing things up, now waiting for results queue to drain 32935 1726853744.65365: results queue empty 32935 1726853744.65366: checking for any_errors_fatal 32935 1726853744.65368: done checking for any_errors_fatal 32935 1726853744.65369: checking for max_fail_percentage 32935 1726853744.65369: done checking for max_fail_percentage 32935 1726853744.65370: checking to see if all hosts have failed and the running result is not ok 32935 1726853744.65372: done checking to see if all hosts have failed 32935 1726853744.65373: getting the remaining hosts for this loop 32935 1726853744.65373: done getting the remaining hosts for this loop 32935 1726853744.65376: getting the next task for host managed_node1 32935 1726853744.65378: done getting next task for host managed_node1 32935 1726853744.65379: ^ task is: TASK: meta (flush_handlers) 32935 1726853744.65380: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 32935 1726853744.65382: getting variables 32935 1726853744.65382: in VariableManager get_vars() 32935 1726853744.65391: Calling all_inventory to load vars for managed_node1 32935 1726853744.65393: Calling groups_inventory to load vars for managed_node1 32935 1726853744.65394: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853744.65397: Calling all_plugins_play to load vars for managed_node1 32935 1726853744.65399: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853744.65400: Calling groups_plugins_play to load vars for managed_node1 32935 1726853744.66023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853744.66844: done with get_vars() 32935 1726853744.66859: done getting variables 32935 1726853744.66893: in VariableManager get_vars() 32935 1726853744.66903: Calling all_inventory to load vars for managed_node1 32935 1726853744.66904: Calling groups_inventory to load vars for managed_node1 32935 1726853744.66906: Calling all_plugins_inventory to load vars for managed_node1 32935 1726853744.66908: Calling all_plugins_play to load vars for managed_node1 32935 1726853744.66914: Calling groups_plugins_inventory to load vars for managed_node1 32935 1726853744.66916: Calling groups_plugins_play to load vars for managed_node1 32935 1726853744.67584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 32935 1726853744.68417: done with get_vars() 32935 1726853744.68434: done queuing things up, now waiting for results queue to drain 32935 1726853744.68435: results queue empty 32935 1726853744.68436: checking for any_errors_fatal 32935 1726853744.68437: done checking for any_errors_fatal 32935 1726853744.68437: checking for max_fail_percentage 32935 1726853744.68438: done checking for max_fail_percentage 32935 1726853744.68438: checking to see if all hosts have failed and the running result is not ok 32935 1726853744.68439: done checking to see if all hosts have failed 32935 1726853744.68439: getting the remaining hosts for this loop 32935 1726853744.68440: done getting the remaining hosts for this loop 32935 1726853744.68442: getting the next task for host managed_node1 32935 1726853744.68444: done getting next task for host managed_node1 32935 1726853744.68445: ^ task is: None 32935 1726853744.68446: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 32935 1726853744.68446: done queuing things up, now waiting for results queue to drain 32935 1726853744.68447: results queue empty 32935 1726853744.68447: checking for any_errors_fatal 32935 1726853744.68448: done checking for any_errors_fatal 32935 1726853744.68448: checking for max_fail_percentage 32935 1726853744.68449: done checking for max_fail_percentage 32935 1726853744.68449: checking to see if all hosts have failed and the running result is not ok 32935 1726853744.68449: done checking to see if all hosts have failed 32935 1726853744.68451: getting the next task for host managed_node1 32935 1726853744.68452: done getting next task for host managed_node1 32935 1726853744.68453: ^ task is: None 32935 1726853744.68453: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1 : ok=79 changed=2 unreachable=0 failed=0 skipped=67 rescued=0 ignored=0

Friday 20 September 2024 13:35:44 -0400 (0:00:00.727) 0:00:29.820 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.87s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_vlan_mtu_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.78s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create veth interface lsr101 -------------------------------------------- 1.37s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.24s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.21s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_vlan_mtu.yml:3
Install iproute --------------------------------------------------------- 0.90s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.88s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.87s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.81s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.73s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.73s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Verify DNS and network connectivity ------------------------------------- 0.73s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gather the minimum subset of ansible_facts required by the network role test --- 0.71s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Install iproute --------------------------------------------------------- 0.68s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Check if system is ostree ----------------------------------------------- 0.65s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Gather current interface info ------------------------------------------- 0.53s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Check routes and DNS ---------------------------------------------------- 0.47s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6
Set up veth as managed by NetworkManager -------------------------------- 0.46s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35
Stat profile file ------------------------------------------------------- 0.45s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
32935 1726853744.68548: RUNNING CLEANUP
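Note: the "Verify DNS and network connectivity" task above ran a small shell check against the two mirror hosts (see the "cmd" field in the recorded result). The snippet below is a minimal, hypothetical reconstruction of that check as a standalone playbook: the play/task wrapper, the file name, and the ansible.builtin.shell module choice are assumptions (the log only shows ansible.legacy.command invoked with _uses_shell=true, which is what the shell module resolves to), while the shell body is copied from the recorded command with indentation adjusted for readability.

# verify_connectivity.yml (hypothetical name) -- minimal sketch, not the actual
# check_network_dns.yml used by the test; run with:
#   ansible-playbook -i inventory.yml verify_connectivity.yml
- hosts: managed_node1
  gather_facts: false
  tasks:
    - name: Verify DNS and network connectivity
      ansible.builtin.shell: |
        set -euo pipefail
        echo CHECK DNS AND CONNECTIVITY
        for host in mirrors.fedoraproject.org mirrors.centos.org; do
          if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
          fi
          if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
          fi
        done

Because the script runs under set -euo pipefail and exits 1 on any failed lookup or curl invocation, a connectivity problem fails the task; in the run above both hosts resolved and responded, so the task finished ok with rc=0 in about 0.28 seconds (delta 0:00:00.278344).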