[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 26764 1726882713.85580: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-Xyq executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 26764 1726882713.85865: Added group all to inventory 26764 1726882713.85867: Added group ungrouped to inventory 26764 1726882713.85870: Group all now contains ungrouped 26764 1726882713.85874: Examining possible inventory source: /tmp/network-91m/inventory.yml 26764 1726882713.94740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 26764 1726882713.94787: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 26764 1726882713.94803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 26764 1726882713.94842: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 26764 1726882713.94894: Loaded config def from plugin (inventory/script) 26764 1726882713.94896: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 26764 1726882713.94924: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 26764 1726882713.94981: Loaded config def from plugin (inventory/yaml) 26764 1726882713.94983: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 26764 1726882713.95043: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 26764 1726882713.95320: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 26764 1726882713.95323: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 26764 1726882713.95325: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 26764 1726882713.95330: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 26764 1726882713.95334: Loading data from /tmp/network-91m/inventory.yml 26764 1726882713.95378: /tmp/network-91m/inventory.yml was not parsable by auto 26764 1726882713.95421: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 26764 1726882713.95451: Loading data from /tmp/network-91m/inventory.yml 26764 1726882713.95506: group all already in inventory 26764 1726882713.95511: set inventory_file for managed_node1 26764 1726882713.95514: set inventory_dir for managed_node1 26764 1726882713.95514: Added host managed_node1 to inventory 26764 1726882713.95516: Added host managed_node1 to group all 26764 1726882713.95516: set ansible_host for managed_node1 26764 1726882713.95517: 
set ansible_ssh_extra_args for managed_node1 26764 1726882713.95519: set inventory_file for managed_node2 26764 1726882713.95521: set inventory_dir for managed_node2 26764 1726882713.95521: Added host managed_node2 to inventory 26764 1726882713.95522: Added host managed_node2 to group all 26764 1726882713.95523: set ansible_host for managed_node2 26764 1726882713.95523: set ansible_ssh_extra_args for managed_node2 26764 1726882713.95525: set inventory_file for managed_node3 26764 1726882713.95527: set inventory_dir for managed_node3 26764 1726882713.95527: Added host managed_node3 to inventory 26764 1726882713.95528: Added host managed_node3 to group all 26764 1726882713.95529: set ansible_host for managed_node3 26764 1726882713.95530: set ansible_ssh_extra_args for managed_node3 26764 1726882713.95531: Reconcile groups and hosts in inventory. 26764 1726882713.95534: Group ungrouped now contains managed_node1 26764 1726882713.95535: Group ungrouped now contains managed_node2 26764 1726882713.95536: Group ungrouped now contains managed_node3 26764 1726882713.95595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 26764 1726882713.95678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 26764 1726882713.95708: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 26764 1726882713.95725: Loaded config def from plugin (vars/host_group_vars) 26764 1726882713.95726: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 26764 1726882713.95731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 26764 1726882713.95736: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 26764 1726882713.95771: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 26764 1726882713.96001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882713.96060: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 26764 1726882713.96089: Loaded config def from plugin (connection/local) 26764 1726882713.96092: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 26764 1726882713.96425: Loaded config def from plugin (connection/paramiko_ssh) 26764 1726882713.96428: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 26764 1726882713.97017: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 26764 1726882713.97042: Loaded config def from plugin (connection/psrp) 26764 1726882713.97044: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 26764 1726882713.97453: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 26764 1726882713.97481: Loaded config def from plugin (connection/ssh) 26764 1726882713.97483: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 26764 1726882713.98683: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 26764 1726882713.98710: Loaded config def from plugin (connection/winrm) 26764 1726882713.98712: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 26764 1726882713.98732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 26764 1726882713.98776: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 26764 1726882713.98820: Loaded config def from plugin (shell/cmd) 26764 1726882713.98822: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 26764 1726882713.98838: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 26764 1726882713.98878: Loaded config def from plugin (shell/powershell) 26764 1726882713.98879: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 26764 1726882713.98917: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 26764 1726882713.99023: Loaded config def from plugin (shell/sh) 26764 1726882713.99025: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 26764 1726882713.99047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 26764 1726882713.99229: Loaded config def from plugin (become/runas) 26764 1726882713.99231: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 26764 1726882713.99342: Loaded config def from plugin (become/su) 26764 1726882713.99344: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 26764 1726882713.99439: Loaded config def from plugin (become/sudo) 26764 1726882713.99440: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 26764 1726882713.99467: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_reapply_nm.yml 26764 1726882713.99683: in VariableManager get_vars() 26764 1726882713.99697: done with get_vars() 26764 1726882713.99784: trying /usr/local/lib/python3.12/site-packages/ansible/modules 26764 1726882714.01700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 26764 1726882714.01772: in VariableManager get_vars() 
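The warning at the top of the run is advisory only: ansible-core 2.17.4 still honors the plural ANSIBLE_COLLECTIONS_PATHS environment variable, and the message itself names the fix, namely switching to the singular ANSIBLE_COLLECTIONS_PATH (or setting deprecation_warnings=False in ansible.cfg to silence it). The inventory parse that follows sets ansible_host and ansible_ssh_extra_args for managed_node1 through managed_node3 from /tmp/network-91m/inventory.yml. A minimal sketch of an inventory with that shape is shown below; the host names match the log, while the addresses and SSH options are placeholders, since the real values never appear in the output:

# Hypothetical reconstruction of /tmp/network-91m/inventory.yml; only the
# structure is implied by the log, the values here are placeholders.
all:
  hosts:
    managed_node1:
      ansible_host: 192.0.2.11                               # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no    # placeholder options
    managed_node2:
      ansible_host: 192.0.2.12                               # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no    # placeholder options
    managed_node3:
      ansible_host: 192.0.2.13                               # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no    # placeholder options

Because the hosts sit directly under all with no child group, the reconcile step in the log places all three into ungrouped.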
26764 1726882714.01775: done with get_vars() 26764 1726882714.01777: variable 'playbook_dir' from source: magic vars 26764 1726882714.01778: variable 'ansible_playbook_python' from source: magic vars 26764 1726882714.01778: variable 'ansible_config_file' from source: magic vars 26764 1726882714.01779: variable 'groups' from source: magic vars 26764 1726882714.01779: variable 'omit' from source: magic vars 26764 1726882714.01780: variable 'ansible_version' from source: magic vars 26764 1726882714.01780: variable 'ansible_check_mode' from source: magic vars 26764 1726882714.01781: variable 'ansible_diff_mode' from source: magic vars 26764 1726882714.01781: variable 'ansible_forks' from source: magic vars 26764 1726882714.01782: variable 'ansible_inventory_sources' from source: magic vars 26764 1726882714.01782: variable 'ansible_skip_tags' from source: magic vars 26764 1726882714.01783: variable 'ansible_limit' from source: magic vars 26764 1726882714.01783: variable 'ansible_run_tags' from source: magic vars 26764 1726882714.01783: variable 'ansible_verbosity' from source: magic vars 26764 1726882714.01805: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml 26764 1726882714.02018: in VariableManager get_vars() 26764 1726882714.02030: done with get_vars() 26764 1726882714.02082: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 26764 1726882714.02208: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 26764 1726882714.02288: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 26764 1726882714.02669: in VariableManager get_vars() 26764 1726882714.02683: done with get_vars() 26764 1726882714.02973: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 26764 1726882714.03058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 26764 1726882714.03932: in VariableManager get_vars() 26764 1726882714.03943: done with get_vars() 26764 1726882714.03968: in VariableManager get_vars() 26764 1726882714.03978: done with get_vars() 26764 1726882714.04037: in VariableManager get_vars() 26764 1726882714.04047: done with get_vars() 26764 1726882714.04072: in VariableManager get_vars() 26764 1726882714.04082: done with get_vars() 26764 1726882714.04378: in VariableManager get_vars() 26764 1726882714.04389: done with get_vars() 26764 1726882714.04424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 26764 1726882714.04432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 26764 1726882714.04585: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 26764 1726882714.04680: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 26764 1726882714.04682: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 26764 1726882714.04702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 26764 1726882714.04717: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 26764 1726882714.04819: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 26764 1726882714.04853: Loaded config def from plugin (callback/default) 26764 1726882714.04855: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 26764 1726882714.05658: Loaded config def from plugin (callback/junit) 26764 1726882714.05660: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 26764 1726882714.05693: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 26764 1726882714.05730: Loaded config def from plugin (callback/minimal) 26764 1726882714.05731: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 26764 1726882714.05759: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 26764 1726882714.05801: Loaded config def from plugin (callback/tree) 26764 1726882714.05806: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 26764 1726882714.05882: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 26764 1726882714.05884: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. 
Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_reapply_nm.yml ************************************************* 2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_reapply_nm.yml 26764 1726882714.05901: in VariableManager get_vars() 26764 1726882714.05909: done with get_vars() 26764 1726882714.05912: in VariableManager get_vars() 26764 1726882714.05917: done with get_vars() 26764 1726882714.05919: variable 'omit' from source: magic vars 26764 1726882714.05941: in VariableManager get_vars() 26764 1726882714.05949: done with get_vars() 26764 1726882714.05961: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_reapply.yml' with nm as provider] ********** 26764 1726882714.07335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 26764 1726882714.07397: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 26764 1726882714.07421: getting the remaining hosts for this loop 26764 1726882714.07422: done getting the remaining hosts for this loop 26764 1726882714.07424: getting the next task for host managed_node2 26764 1726882714.07426: done getting next task for host managed_node2 26764 1726882714.07428: ^ task is: TASK: Gathering Facts 26764 1726882714.07429: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882714.07430: getting variables 26764 1726882714.07431: in VariableManager get_vars() 26764 1726882714.07437: Calling all_inventory to load vars for managed_node2 26764 1726882714.07439: Calling groups_inventory to load vars for managed_node2 26764 1726882714.07440: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882714.07448: Calling all_plugins_play to load vars for managed_node2 26764 1726882714.07454: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882714.07456: Calling groups_plugins_play to load vars for managed_node2 26764 1726882714.07482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882714.07520: done with get_vars() 26764 1726882714.07524: done getting variables 26764 1726882714.07570: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_reapply_nm.yml:6 Friday 20 September 2024 21:38:34 -0400 (0:00:00.017) 0:00:00.017 ****** 26764 1726882714.07585: entering _queue_task() for managed_node2/gather_facts 26764 1726882714.07585: Creating lock for gather_facts 26764 1726882714.07838: worker is 1 (out of 1 available) 26764 1726882714.07849: exiting _queue_task() for managed_node2/gather_facts 26764 1726882714.07862: done queuing things up, now waiting for results queue to drain 26764 1726882714.07868: waiting for pending results... 
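The play banner and the task path (tests_reapply_nm.yml:6) indicate that the top-level file is a thin wrapper: it gathers facts and runs playbooks/tests_reapply.yml with NetworkManager ('nm') as the provider. The wrapper itself is not reproduced in the log; a sketch of the typical shape of such a file, assuming a set_fact for network_provider and an import of the base playbook (the play name and the imported path come from the log, everything else is an assumption), would be:

# Hypothetical sketch of tests_reapply_nm.yml; task names and layout are assumptions.
- name: Run playbook 'playbooks/tests_reapply.yml' with nm as provider
  hosts: all
  tasks:
    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

- import_playbook: playbooks/tests_reapply.yml

The "2 plays in .../tests_reapply_nm.yml" count reported above would then be the wrapper play plus the play imported from playbooks/tests_reapply.yml.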
26764 1726882714.07999: running TaskExecutor() for managed_node2/TASK: Gathering Facts 26764 1726882714.08054: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000037 26764 1726882714.08062: variable 'ansible_search_path' from source: unknown 26764 1726882714.08097: calling self._execute() 26764 1726882714.08140: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882714.08144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882714.08160: variable 'omit' from source: magic vars 26764 1726882714.08224: variable 'omit' from source: magic vars 26764 1726882714.08244: variable 'omit' from source: magic vars 26764 1726882714.08271: variable 'omit' from source: magic vars 26764 1726882714.08309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882714.08339: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882714.08357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882714.08377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882714.08384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882714.08408: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882714.08411: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882714.08414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882714.08487: Set connection var ansible_shell_executable to /bin/sh 26764 1726882714.08490: Set connection var ansible_shell_type to sh 26764 1726882714.08499: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882714.08504: Set connection var ansible_timeout to 10 26764 1726882714.08508: Set connection var ansible_connection to ssh 26764 1726882714.08513: Set connection var ansible_pipelining to False 26764 1726882714.08529: variable 'ansible_shell_executable' from source: unknown 26764 1726882714.08532: variable 'ansible_connection' from source: unknown 26764 1726882714.08534: variable 'ansible_module_compression' from source: unknown 26764 1726882714.08536: variable 'ansible_shell_type' from source: unknown 26764 1726882714.08539: variable 'ansible_shell_executable' from source: unknown 26764 1726882714.08541: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882714.08545: variable 'ansible_pipelining' from source: unknown 26764 1726882714.08547: variable 'ansible_timeout' from source: unknown 26764 1726882714.08551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882714.08683: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882714.08691: variable 'omit' from source: magic vars 26764 1726882714.08701: starting attempt loop 26764 1726882714.08703: running the handler 26764 1726882714.08714: variable 'ansible_facts' from source: unknown 26764 1726882714.08726: _low_level_execute_command(): starting 26764 1726882714.08732: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882714.09260: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882714.09281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882714.09294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882714.09305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882714.09361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882714.09384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882714.09490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882714.11145: stdout chunk (state=3): >>>/root <<< 26764 1726882714.11249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882714.11301: stderr chunk (state=3): >>><<< 26764 1726882714.11305: stdout chunk (state=3): >>><<< 26764 1726882714.11320: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882714.11330: _low_level_execute_command(): starting 26764 1726882714.11335: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553 `" && echo ansible-tmp-1726882714.1131928-26767-163477697378553="` echo /root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553 `" ) && sleep 0' 26764 1726882714.11777: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882714.11794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882714.11806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882714.11819: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882714.11844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882714.11876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882714.11888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882714.11990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882714.13865: stdout chunk (state=3): >>>ansible-tmp-1726882714.1131928-26767-163477697378553=/root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553 <<< 26764 1726882714.13969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882714.14011: stderr chunk (state=3): >>><<< 26764 1726882714.14014: stdout chunk (state=3): >>><<< 26764 1726882714.14027: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882714.1131928-26767-163477697378553=/root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882714.14051: variable 'ansible_module_compression' from source: unknown 26764 1726882714.14099: ANSIBALLZ: Using generic lock for ansible.legacy.setup 26764 1726882714.14105: ANSIBALLZ: Acquiring lock 26764 1726882714.14108: ANSIBALLZ: Lock acquired: 140693693673600 26764 1726882714.14111: ANSIBALLZ: Creating module 26764 1726882714.32239: 
ANSIBALLZ: Writing module into payload 26764 1726882714.32349: ANSIBALLZ: Writing module 26764 1726882714.32370: ANSIBALLZ: Renaming module 26764 1726882714.32375: ANSIBALLZ: Done creating module 26764 1726882714.32400: variable 'ansible_facts' from source: unknown 26764 1726882714.32406: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882714.32414: _low_level_execute_command(): starting 26764 1726882714.32419: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 26764 1726882714.32899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882714.32912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882714.32932: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882714.32944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882714.32953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882714.33004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882714.33015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882714.33136: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882714.34795: stdout chunk (state=3): >>>PLATFORM <<< 26764 1726882714.34867: stdout chunk (state=3): >>>Linux <<< 26764 1726882714.34889: stdout chunk (state=3): >>>FOUND <<< 26764 1726882714.34907: stdout chunk (state=3): >>>/usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 <<< 26764 1726882714.34914: stdout chunk (state=3): >>>ENDFOUND <<< 26764 1726882714.35045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882714.35093: stderr chunk (state=3): >>><<< 26764 1726882714.35097: stdout chunk (state=3): >>><<< 26764 1726882714.35110: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882714.35120 [managed_node2]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 26764 1726882714.35155: _low_level_execute_command(): starting 26764 1726882714.35159: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 26764 1726882714.35235: Sending initial data 26764 1726882714.35244: Sent initial data (1181 bytes) 26764 1726882714.35606: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882714.35618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882714.35637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882714.35649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882714.35662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882714.35707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882714.35719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882714.35824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882714.39599: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 26764 1726882714.39982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882714.40024: stderr chunk (state=3): >>><<< 26764 1726882714.40028: stdout chunk (state=3): >>><<< 26764 1726882714.40038: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel 
fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882714.40096: variable 'ansible_facts' from source: unknown 26764 1726882714.40099: variable 'ansible_facts' from source: unknown 26764 1726882714.40107: variable 'ansible_module_compression' from source: unknown 26764 1726882714.40137: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26764 1726882714.40158: variable 'ansible_facts' from source: unknown 26764 1726882714.40267: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553/AnsiballZ_setup.py 26764 1726882714.40379: Sending initial data 26764 1726882714.40392: Sent initial data (154 bytes) 26764 1726882714.41036: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882714.41059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882714.41077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882714.41087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882714.41131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882714.41146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 26764 1726882714.41246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882714.43025: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 26764 1726882714.43030: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882714.43122: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882714.43222: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmppncpqt_o /root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553/AnsiballZ_setup.py <<< 26764 1726882714.43320: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882714.45294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882714.45380: stderr chunk (state=3): >>><<< 26764 1726882714.45383: stdout chunk (state=3): >>><<< 26764 1726882714.45400: done transferring module to remote 26764 1726882714.45411: _low_level_execute_command(): starting 26764 1726882714.45414: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553/ /root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553/AnsiballZ_setup.py && sleep 0' 26764 1726882714.45830: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882714.45842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882714.45854: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882714.45873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882714.45884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882714.45928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882714.45940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882714.46046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882714.47876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 
1726882714.47913: stderr chunk (state=3): >>><<< 26764 1726882714.47916: stdout chunk (state=3): >>><<< 26764 1726882714.47927: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882714.47930: _low_level_execute_command(): starting 26764 1726882714.47934: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553/AnsiballZ_setup.py && sleep 0' 26764 1726882714.48340: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882714.48352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882714.48374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882714.48386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882714.48397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882714.48443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882714.48455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882714.48567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882714.50591: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # <<< 26764 1726882714.50603: stdout chunk (state=3): >>>import '_weakref' # <<< 26764 1726882714.50715: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # <<< 26764 1726882714.50781: stdout chunk (state=3): >>>import '_frozen_importlib_external' # 
# installing zipimport hook <<< 26764 1726882714.50822: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 26764 1726882714.50955: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882714.51001: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 26764 1726882714.51199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f4d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 26764 1726882714.51316: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d3a0> <<< 26764 1726882714.51345: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f4d8b20> <<< 26764 1726882714.51505: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 26764 1726882714.51510: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f4d8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 26764 1726882714.51513: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d490> <<< 26764 1726882714.51524: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f434190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f434220> 
# /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f457850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f434940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f495880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f42dd90> <<< 26764 1726882714.51573: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 26764 1726882714.51586: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f457d90> <<< 26764 1726882714.51623: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d970> <<< 26764 1726882714.51657: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 26764 1726882714.51986: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 26764 1726882714.51992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 26764 1726882714.52018: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 26764 1726882714.52031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 26764 1726882714.52041: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 26764 1726882714.52062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 26764 1726882714.52085: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 26764 1726882714.52095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 26764 1726882714.52102: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1aeeb0> <<< 26764 1726882714.52150: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1b1f40> <<< 26764 1726882714.52177: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 26764 1726882714.52183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 26764 1726882714.52195: stdout chunk (state=3): >>>import '_sre' # <<< 26764 1726882714.52218: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 26764 1726882714.52240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc'<<< 26764 1726882714.52243: stdout chunk (state=3): >>> <<< 26764 1726882714.52257: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 26764 1726882714.52271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 26764 1726882714.52282: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1a7610> <<< 26764 1726882714.52297: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1ad640> <<< 26764 1726882714.52305: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1ae370> <<< 26764 1726882714.52329: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 26764 1726882714.52404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 26764 1726882714.52425: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 26764 1726882714.52449: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882714.52474: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 26764 1726882714.52478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 26764 1726882714.52525: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639f094dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0948b0> <<< 26764 1726882714.52611: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f094eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 26764 1726882714.52635: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f094f70> <<< 26764 1726882714.52655: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f094e80> <<< 26764 1726882714.52681: stdout chunk (state=3): >>>import '_collections' # <<< 26764 1726882714.52742: stdout chunk (state=3): >>>import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f189d30> <<< 26764 1726882714.52749: stdout chunk (state=3): >>>import '_functools' # <<< 26764 1726882714.52752: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f182610> <<< 26764 1726882714.53342: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f196670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1b5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639f0a6c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f189250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639f196280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1bb9d0> <<< 26764 1726882714.54341: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a6fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a6d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a6d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from 
'/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f079370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f079460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0aefa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a8a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a8490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efc71c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f064c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a8eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1bb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efd9af0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efd9e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efeb730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efebc70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ef833a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efd9f10> # 
/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ef94280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efeb5b0> <<< 26764 1726882714.54351: stdout chunk (state=3): >>>import 'pwd' # <<< 26764 1726882714.54354: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ef94340> <<< 26764 1726882714.54357: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a69d0> <<< 26764 1726882714.54359: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 26764 1726882714.54361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 26764 1726882714.54381: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efaf6a0> <<< 26764 1726882714.54383: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 26764 1726882714.54387: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882714.54406: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efaf970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efaf760> <<< 26764 1726882714.54414: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efaf850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 26764 1726882714.54530: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from 
'/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efafca0> <<< 26764 1726882714.54600: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efbc1f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efaf8e0> <<< 26764 1726882714.54618: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efa3a30> <<< 26764 1726882714.54622: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a65b0> <<< 26764 1726882714.54632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 26764 1726882714.54715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 26764 1726882714.54739: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efafa90> <<< 26764 1726882714.54859: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 26764 1726882714.54902: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f639eed3670> <<< 26764 1726882714.55151: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 26764 1726882714.55320: stdout chunk (state=3): >>># zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 26764 1726882714.55343: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.56560: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.57448: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 26764 1726882714.57454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee117c0> <<< 26764 1726882714.57478: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882714.57501: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 26764 1726882714.57523: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 26764 1726882714.57548: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882714.57553: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ee11160> <<< 26764 1726882714.57592: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee11280> <<< 26764 1726882714.57619: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee11f10> <<< 26764 1726882714.57641: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 26764 1726882714.57691: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee114f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee11d30> <<< 26764 1726882714.57695: stdout chunk (state=3): >>>import 'atexit' # <<< 26764 1726882714.57729: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882714.57738: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ee11f70> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 26764 1726882714.57773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 26764 1726882714.57804: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee11100> <<< 26764 1726882714.57831: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 26764 1726882714.57840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 26764 1726882714.57860: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 26764 1726882714.57878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 26764 1726882714.57899: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 26764 1726882714.57902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 26764 1726882714.58013: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ede6130> <<< 26764 1726882714.58041: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f639e7c70d0> <<< 26764 1726882714.58055: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e7c72b0> <<< 26764 1726882714.58098: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 26764 1726882714.58118: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e7c7c40> <<< 26764 1726882714.58166: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edf8dc0> <<< 26764 1726882714.58419: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edf83a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edf8f70> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 26764 1726882714.58445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 26764 1726882714.58470: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee46c10> <<< 26764 1726882714.58601: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee18cd0> <<< 26764 1726882714.58605: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee183a0> <<< 26764 1726882714.58619: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edc5b80> <<< 26764 1726882714.58622: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ee184c0> <<< 26764 1726882714.58635: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee184f0> <<< 26764 1726882714.58638: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 26764 1726882714.58642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 26764 1726882714.58651: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 26764 1726882714.58685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 26764 1726882714.58781: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed48250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee581f0> <<< 26764 1726882714.58785: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 26764 1726882714.58833: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882714.58841: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed558e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee58370> <<< 26764 1726882714.59007: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 26764 1726882714.59117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee58ca0> <<< 26764 1726882714.59129: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed55880> <<< 26764 1726882714.59206: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882714.59247: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed488b0> <<< 26764 1726882714.59279: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639edf1190> <<< 26764 1726882714.59283: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882714.59292: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ee58670> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee518b0> <<< 26764 1726882714.59312: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py <<< 26764 1726882714.59330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 26764 1726882714.59351: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 26764 1726882714.59371: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 26764 1726882714.59397: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882714.59413: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed4a9d0> <<< 26764 1726882714.59603: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882714.59750: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed67b80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed54640> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed4af70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed54a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 26764 1726882714.59802: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.59961: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 26764 1726882714.60054: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.60161: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.60517: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.60977: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py <<< 26764 1726882714.60994: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 26764 1726882714.61013: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 26764 1726882714.61037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882714.61077: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882714.61127: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed907c0> <<< 26764 1726882714.61155: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed95820> <<< 26764 1726882714.61179: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3ba9a0> <<< 26764 1726882714.61210: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 26764 1726882714.61230: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.61255: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.61275: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 26764 1726882714.61372: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.61499: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 26764 1726882714.61523: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edcf760> # zipimport: zlib available <<< 26764 1726882714.61919: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.62280: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.62335: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.62396: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 26764 1726882714.62407: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.62444: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.62477: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 26764 1726882714.62528: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.62613: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 26764 1726882714.62634: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 26764 1726882714.62673: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.62709: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 26764 1726882714.62719: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.62904: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.63178: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 26764 1726882714.63211: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee143d0> <<< 26764 1726882714.63214: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.63257: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.63348: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 26764 1726882714.63379: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.63394: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.63436: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib 
available <<< 26764 1726882714.63477: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.63505: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.63602: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.63656: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 26764 1726882714.63686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882714.63757: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed879a0> <<< 26764 1726882714.63854: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e236be0> <<< 26764 1726882714.63898: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 26764 1726882714.63948: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64016: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64031: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64087: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 26764 1726882714.64142: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 26764 1726882714.64145: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 26764 1726882714.64171: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 26764 1726882714.64254: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed98670> <<< 26764 1726882714.64288: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ede3d90> <<< 26764 1726882714.64361: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee14400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 26764 1726882714.64403: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64406: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 26764 1726882714.64504: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 26764 1726882714.64527: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 26764 1726882714.64531: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64556: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64639: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26764 1726882714.64652: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64683: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64722: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64746: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64782: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 26764 1726882714.64856: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64927: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64940: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.64979: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 26764 1726882714.65114: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.65254: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.65289: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.65325: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882714.65394: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 26764 1726882714.65407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 26764 1726882714.65437: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3e3ac0> # 
/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 26764 1726882714.65458: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 26764 1726882714.65489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 26764 1726882714.65533: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 26764 1726882714.65544: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e39da90> <<< 26764 1726882714.65574: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e39da00> <<< 26764 1726882714.65639: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3cf760> <<< 26764 1726882714.65669: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3e3190> <<< 26764 1726882714.65675: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e13af10> <<< 26764 1726882714.65707: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e13aaf0> <<< 26764 1726882714.65711: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 26764 1726882714.65743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 26764 1726882714.65748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 26764 1726882714.65806: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639edf4cd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e37feb0> <<< 26764 1726882714.65831: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 26764 1726882714.65846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 26764 1726882714.65860: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edf42e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 26764 
1726882714.65898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 26764 1726882714.65911: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e1a2fa0> <<< 26764 1726882714.65947: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3ccdc0> <<< 26764 1726882714.65993: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e13adc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available <<< 26764 1726882714.66014: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 26764 1726882714.66061: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.66119: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 26764 1726882714.66222: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 26764 1726882714.66242: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 26764 1726882714.66283: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.66312: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 26764 1726882714.66344: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.66390: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 26764 1726882714.66401: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.66429: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.66472: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 26764 1726882714.66484: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.66530: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.66584: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.66619: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.66683: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 26764 1726882714.66695: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.67081: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.67437: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 26764 1726882714.67496: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.67532: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.67554: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.67601: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available <<< 26764 1726882714.67623: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.67649: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 26764 1726882714.67709: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.67756: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 26764 1726882714.67780: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.67811: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.67841: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 26764 1726882714.67932: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.68007: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882714.68021: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 26764 1726882714.68057: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3bf670> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 26764 1726882714.68101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 26764 1726882714.68243: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e0baf10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 26764 1726882714.68306: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.68376: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 26764 1726882714.68379: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.68445: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.68690: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882714.68739: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 26764 1726882714.68756: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 26764 1726882714.68922: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e0a9c10> <<< 26764 1726882714.69155: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e0f9b20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 26764 1726882714.69210: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.69245: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 26764 1726882714.69327: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.69386: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.69485: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.69616: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 26764 1726882714.69662: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.69695: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available <<< 26764 1726882714.69721: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.69816: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 26764 1726882714.69861: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e0354f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e035a30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 26764 1726882714.69893: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.69940: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 26764 1726882714.70079: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.70224: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 26764 1726882714.70235: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.70298: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.70366: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.70397: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 26764 1726882714.70525: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 26764 1726882714.70546: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.70590: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26764 1726882714.70669: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.70797: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 26764 1726882714.70907: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.71006: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 26764 1726882714.71018: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.71047: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.71066: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.71499: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.71924: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 26764 1726882714.71927: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.72005: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.72100: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 26764 1726882714.72187: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.72282: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 26764 1726882714.72285: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.72399: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.72557: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 26764 1726882714.72573: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.72597: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.72643: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available <<< 26764 1726882714.72749: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.72813: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.72987: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.73159: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 26764 1726882714.73162: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.73189: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.73270: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882714.73293: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 26764 1726882714.73354: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.73399: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 26764 1726882714.73446: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.73467: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 26764 1726882714.73501: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.73577: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 26764 1726882714.73612: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.73657: 
stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 26764 1726882714.73873: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74112: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 26764 1726882714.74115: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74146: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74209: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 26764 1726882714.74215: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74248: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74281: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 26764 1726882714.74285: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74306: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74346: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 26764 1726882714.74379: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74405: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 26764 1726882714.74420: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74483: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74566: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 26764 1726882714.74588: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 26764 1726882714.74626: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74687: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 26764 1726882714.74700: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74719: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26764 
1726882714.74757: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74806: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74855: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74943: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 26764 1726882714.74946: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.74978: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.75033: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 26764 1726882714.75193: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.75366: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 26764 1726882714.75371: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.75395: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.75446: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 26764 1726882714.75484: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.75532: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 26764 1726882714.75535: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.75609: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.75677: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 26764 1726882714.75700: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.75783: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.75843: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 26764 1726882714.75846: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 26764 1726882714.75917: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882714.76132: stdout chunk (state=3): >>>import 'gc' # <<< 26764 1726882714.76565: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 26764 1726882714.76601: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 26764 1726882714.76639: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639de650a0> <<< 26764 1726882714.76654: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639de658e0> <<< 26764 1726882714.76717: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639de65e20> <<< 26764 1726882714.80545: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 26764 1726882714.80570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 26764 1726882714.80597: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639de65fa0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 26764 1726882714.80615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 26764 1726882714.80628: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e080280> <<< 26764 1726882714.80687: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882714.80729: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7f639e0281f0> <<< 26764 1726882714.80749: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e028640> <<< 26764 1726882714.81044: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 26764 1726882715.05158: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAY<<< 26764 1726882715.05209: stdout chunk (state=3): >>>nw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "34", "epoch": "1726882714", "epoch_int": "1726882714", "date": 
"2024-09-20", "time": "21:38:34", "iso8601_micro": "2024-09-21T01:38:34.763889Z", "iso8601": "2024-09-21T01:38:34Z", "iso8601_basic": "20240920T213834763889", "iso8601_basic_short": "20240920T213834", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3267, "used": 265}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", 
"sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 653, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238518272, "block_size": 4096, "block_total": 65519355, "block_available": 64511357, "block_used": 1007998, "inode_total": 131071472, "inode_available": 130998694, "inode_used": 72778, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": 
"10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.44, "5m": 0.41, "15m": 0.24}, "ansible_fibre_channel_wwn": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": 
"/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26764 1726882715.05726: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 26764 1726882715.05919: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # 
cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible <<< 26764 1726882715.06179: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes<<< 26764 1726882715.06231: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] 
removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing 
ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] 
removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy 
ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 26764 1726882715.06518: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 26764 1726882715.06545: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 26764 1726882715.06570: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 26764 1726882715.06619: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 26764 1726882715.06624: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 26764 1726882715.06646: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 26764 1726882715.06690: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 26764 1726882715.06754: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 26764 1726882715.06800: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 26764 1726882715.06808: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 26764 1726882715.06852: stdout chunk (state=3): >>># destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 26764 1726882715.06895: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection <<< 26764 1726882715.06898: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 26764 1726882715.06992: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle <<< 26764 1726882715.07049: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] 
wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib <<< 26764 1726882715.07115: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 26764 1726882715.07153: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 26764 1726882715.07203: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios <<< 26764 1726882715.07206: stdout chunk (state=3): >>># destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 26764 1726882715.07399: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 26764 1726882715.07500: stdout chunk (state=3): >>># destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 26764 1726882715.07506: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks <<< 26764 1726882715.07799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882715.07879: stderr chunk (state=3): >>><<< 26764 1726882715.07882: stdout chunk (state=3): >>><<< 26764 1726882715.08135: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f4d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f4d8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f4d8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f639f434190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f434220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f457850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f434940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f495880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f42dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f457d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f47d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1aeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1b1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1a7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1ad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1ae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639f094dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0948b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f094eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f094f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f094e80> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f189d30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f182610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f196670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1b5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639f0a6c70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f189250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639f196280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1bb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a6fa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a6d90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a6d00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f079370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f079460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0aefa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a8a30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a8490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efc71c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f064c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a8eb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f1bb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efd9af0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efd9e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efeb730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efebc70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ef833a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efd9f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ef94280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efeb5b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ef94340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a69d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efaf6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efaf970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efaf760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efaf850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efafca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639efbc1f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efaf8e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efa3a30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639f0a65b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639efafa90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f639eed3670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee117c0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ee11160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee11280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee11f10> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee114f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee11d30> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ee11f70> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee11100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ede6130> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e7c70d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e7c72b0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e7c7c40> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edf8dc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edf83a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edf8f70> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee46c10> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee18cd0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee183a0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edc5b80> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ee184c0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee184f0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed48250> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee581f0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed558e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee58370> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee58ca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed55880> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed488b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639edf1190> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ee58670> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee518b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # 
extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed4a9d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed67b80> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed54640> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed4af70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed54a30> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed907c0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed95820> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3ba9a0> import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edcf760> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee143d0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639ed879a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e236be0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ed98670> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ede3d90> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639ee14400> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3e3ac0> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e39da90> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e39da00> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3cf760> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3e3190> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e13af10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e13aaf0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639edf4cd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e37feb0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639edf42e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e1a2fa0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3ccdc0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e13adc0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e3bf670> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e0baf10> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e0a9c10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e0f9b20> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639e0354f0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e035a30> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_wvf0gcuu/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from 
'/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f639de650a0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639de658e0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639de65e20> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639de65fa0> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e080280> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e0281f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f639e028640> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "34", "epoch": "1726882714", "epoch_int": "1726882714", "date": "2024-09-20", "time": "21:38:34", "iso8601_micro": "2024-09-21T01:38:34.763889Z", "iso8601": "2024-09-21T01:38:34Z", "iso8601_basic": "20240920T213834763889", "iso8601_basic_short": "20240920T213834", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2804, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 728, "free": 2804}, "nocache": {"free": 3267, "used": 265}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 653, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238518272, "block_size": 4096, "block_total": 65519355, "block_available": 64511357, "block_used": 1007998, "inode_total": 131071472, "inode_available": 130998694, "inode_used": 72778, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": 
"off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.44, "5m": 0.41, "15m": 0.24}, "ansible_fibre_channel_wwn": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # 
cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] 
removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] 
removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy 
ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy 
multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # 
destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 26764 1726882715.09998: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882715.10001: _low_level_execute_command(): starting 26764 1726882715.10003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882714.1131928-26767-163477697378553/ > /dev/null 2>&1 && sleep 0' 26764 1726882715.11681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.11684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.11832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.11840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 26764 1726882715.11842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.11909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882715.11973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882715.12142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882715.14008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882715.14077: stderr chunk (state=3): >>><<< 26764 1726882715.14080: stdout chunk (state=3): >>><<< 26764 1726882715.14371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882715.14374: handler run complete 26764 1726882715.14377: variable 'ansible_facts' from source: unknown 26764 1726882715.14379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882715.14649: variable 'ansible_facts' from source: unknown 26764 1726882715.14745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882715.15266: attempt loop complete, returning result 26764 1726882715.15357: _execute() done 26764 1726882715.15368: dumping result to json 26764 1726882715.15404: done dumping result, returning 26764 1726882715.15416: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-9875-c9a3-000000000037] 26764 1726882715.15469: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000037 26764 1726882715.16698: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000037 ok: [managed_node2] 26764 1726882715.16716: WORKER PROCESS EXITING 26764 1726882715.16830: no more pending results, returning what we have 26764 1726882715.16837: results queue empty 26764 1726882715.16838: checking for any_errors_fatal 26764 1726882715.16840: done checking for any_errors_fatal 26764 1726882715.16840: checking for max_fail_percentage 26764 1726882715.16842: done checking for max_fail_percentage 26764 1726882715.16843: checking to see if all hosts have 
failed and the running result is not ok 26764 1726882715.16844: done checking to see if all hosts have failed 26764 1726882715.16845: getting the remaining hosts for this loop 26764 1726882715.16846: done getting the remaining hosts for this loop 26764 1726882715.16850: getting the next task for host managed_node2 26764 1726882715.16857: done getting next task for host managed_node2 26764 1726882715.16859: ^ task is: TASK: meta (flush_handlers) 26764 1726882715.16861: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882715.16868: getting variables 26764 1726882715.16870: in VariableManager get_vars() 26764 1726882715.16896: Calling all_inventory to load vars for managed_node2 26764 1726882715.16899: Calling groups_inventory to load vars for managed_node2 26764 1726882715.16902: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882715.16913: Calling all_plugins_play to load vars for managed_node2 26764 1726882715.16923: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882715.16927: Calling groups_plugins_play to load vars for managed_node2 26764 1726882715.17102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882715.17616: done with get_vars() 26764 1726882715.17626: done getting variables 26764 1726882715.17928: in VariableManager get_vars() 26764 1726882715.17938: Calling all_inventory to load vars for managed_node2 26764 1726882715.17940: Calling groups_inventory to load vars for managed_node2 26764 1726882715.17942: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882715.17947: Calling all_plugins_play to load vars for managed_node2 26764 1726882715.17949: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882715.17952: Calling groups_plugins_play to load vars for managed_node2 26764 1726882715.18385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882715.19356: done with get_vars() 26764 1726882715.19375: done queuing things up, now waiting for results queue to drain 26764 1726882715.19378: results queue empty 26764 1726882715.19379: checking for any_errors_fatal 26764 1726882715.19381: done checking for any_errors_fatal 26764 1726882715.19382: checking for max_fail_percentage 26764 1726882715.19383: done checking for max_fail_percentage 26764 1726882715.19391: checking to see if all hosts have failed and the running result is not ok 26764 1726882715.19392: done checking to see if all hosts have failed 26764 1726882715.19393: getting the remaining hosts for this loop 26764 1726882715.19394: done getting the remaining hosts for this loop 26764 1726882715.19396: getting the next task for host managed_node2 26764 1726882715.19401: done getting next task for host managed_node2 26764 1726882715.19403: ^ task is: TASK: Include the task 'el_repo_setup.yml' 26764 1726882715.19404: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882715.19407: getting variables 26764 1726882715.19408: in VariableManager get_vars() 26764 1726882715.19417: Calling all_inventory to load vars for managed_node2 26764 1726882715.19419: Calling groups_inventory to load vars for managed_node2 26764 1726882715.19422: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882715.19426: Calling all_plugins_play to load vars for managed_node2 26764 1726882715.19546: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882715.19550: Calling groups_plugins_play to load vars for managed_node2 26764 1726882715.19806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882715.20246: done with get_vars() 26764 1726882715.20254: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_reapply_nm.yml:11 Friday 20 September 2024 21:38:35 -0400 (0:00:01.128) 0:00:01.146 ****** 26764 1726882715.20451: entering _queue_task() for managed_node2/include_tasks 26764 1726882715.20453: Creating lock for include_tasks 26764 1726882715.22044: worker is 1 (out of 1 available) 26764 1726882715.22053: exiting _queue_task() for managed_node2/include_tasks 26764 1726882715.22062: done queuing things up, now waiting for results queue to drain 26764 1726882715.22080: waiting for pending results... 26764 1726882715.22129: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 26764 1726882715.22134: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000006 26764 1726882715.22137: variable 'ansible_search_path' from source: unknown 26764 1726882715.22140: calling self._execute() 26764 1726882715.22168: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882715.22172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882715.22298: variable 'omit' from source: magic vars 26764 1726882715.22392: _execute() done 26764 1726882715.22700: dumping result to json 26764 1726882715.22709: done dumping result, returning 26764 1726882715.22719: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-9875-c9a3-000000000006] 26764 1726882715.22731: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000006 26764 1726882715.22943: no more pending results, returning what we have 26764 1726882715.22947: in VariableManager get_vars() 26764 1726882715.22986: Calling all_inventory to load vars for managed_node2 26764 1726882715.22989: Calling groups_inventory to load vars for managed_node2 26764 1726882715.22993: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882715.23008: Calling all_plugins_play to load vars for managed_node2 26764 1726882715.23011: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882715.23014: Calling groups_plugins_play to load vars for managed_node2 26764 1726882715.23228: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000006 26764 1726882715.23233: WORKER PROCESS EXITING 26764 1726882715.23279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882715.23695: done with get_vars() 26764 1726882715.23955: variable 'ansible_search_path' from source: unknown 26764 1726882715.23974: we have included files to process 26764 1726882715.23975: generating 
all_blocks data 26764 1726882715.23977: done generating all_blocks data 26764 1726882715.23978: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 26764 1726882715.23979: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 26764 1726882715.23982: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 26764 1726882715.26317: in VariableManager get_vars() 26764 1726882715.26476: done with get_vars() 26764 1726882715.26489: done processing included file 26764 1726882715.26491: iterating over new_blocks loaded from include file 26764 1726882715.26493: in VariableManager get_vars() 26764 1726882715.26502: done with get_vars() 26764 1726882715.26504: filtering new block on tags 26764 1726882715.26522: done filtering new block on tags 26764 1726882715.26526: in VariableManager get_vars() 26764 1726882715.26651: done with get_vars() 26764 1726882715.26653: filtering new block on tags 26764 1726882715.26758: done filtering new block on tags 26764 1726882715.26761: in VariableManager get_vars() 26764 1726882715.26777: done with get_vars() 26764 1726882715.26778: filtering new block on tags 26764 1726882715.26810: done filtering new block on tags 26764 1726882715.26813: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 26764 1726882715.26819: extending task lists for all hosts with included blocks 26764 1726882715.26930: done extending task lists 26764 1726882715.26932: done processing included files 26764 1726882715.26933: results queue empty 26764 1726882715.26933: checking for any_errors_fatal 26764 1726882715.26935: done checking for any_errors_fatal 26764 1726882715.26936: checking for max_fail_percentage 26764 1726882715.26937: done checking for max_fail_percentage 26764 1726882715.26938: checking to see if all hosts have failed and the running result is not ok 26764 1726882715.26938: done checking to see if all hosts have failed 26764 1726882715.26939: getting the remaining hosts for this loop 26764 1726882715.26940: done getting the remaining hosts for this loop 26764 1726882715.26943: getting the next task for host managed_node2 26764 1726882715.26947: done getting next task for host managed_node2 26764 1726882715.26949: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 26764 1726882715.26951: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882715.26953: getting variables 26764 1726882715.26954: in VariableManager get_vars() 26764 1726882715.26962: Calling all_inventory to load vars for managed_node2 26764 1726882715.26968: Calling groups_inventory to load vars for managed_node2 26764 1726882715.26971: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882715.27089: Calling all_plugins_play to load vars for managed_node2 26764 1726882715.27093: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882715.27097: Calling groups_plugins_play to load vars for managed_node2 26764 1726882715.27438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882715.27880: done with get_vars() 26764 1726882715.27889: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:38:35 -0400 (0:00:00.074) 0:00:01.221 ****** 26764 1726882715.28069: entering _queue_task() for managed_node2/setup 26764 1726882715.28761: worker is 1 (out of 1 available) 26764 1726882715.28780: exiting _queue_task() for managed_node2/setup 26764 1726882715.28793: done queuing things up, now waiting for results queue to drain 26764 1726882715.28794: waiting for pending results... 26764 1726882715.29736: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 26764 1726882715.29942: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000048 26764 1726882715.29967: variable 'ansible_search_path' from source: unknown 26764 1726882715.29975: variable 'ansible_search_path' from source: unknown 26764 1726882715.30014: calling self._execute() 26764 1726882715.30194: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882715.30267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882715.30285: variable 'omit' from source: magic vars 26764 1726882715.31539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882715.37936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882715.38332: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882715.38468: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882715.38586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882715.38689: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882715.38803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882715.38979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882715.39011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 26764 1726882715.39099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882715.39182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882715.39570: variable 'ansible_facts' from source: unknown 26764 1726882715.39770: variable 'network_test_required_facts' from source: task vars 26764 1726882715.39838: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 26764 1726882715.39930: variable 'omit' from source: magic vars 26764 1726882715.39979: variable 'omit' from source: magic vars 26764 1726882715.40016: variable 'omit' from source: magic vars 26764 1726882715.40352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882715.40399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882715.40457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882715.40503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882715.40560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882715.40698: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882715.40706: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882715.40712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882715.40939: Set connection var ansible_shell_executable to /bin/sh 26764 1726882715.40946: Set connection var ansible_shell_type to sh 26764 1726882715.40960: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882715.41004: Set connection var ansible_timeout to 10 26764 1726882715.41014: Set connection var ansible_connection to ssh 26764 1726882715.41078: Set connection var ansible_pipelining to False 26764 1726882715.41117: variable 'ansible_shell_executable' from source: unknown 26764 1726882715.41126: variable 'ansible_connection' from source: unknown 26764 1726882715.41223: variable 'ansible_module_compression' from source: unknown 26764 1726882715.41235: variable 'ansible_shell_type' from source: unknown 26764 1726882715.41243: variable 'ansible_shell_executable' from source: unknown 26764 1726882715.41250: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882715.41258: variable 'ansible_pipelining' from source: unknown 26764 1726882715.41270: variable 'ansible_timeout' from source: unknown 26764 1726882715.41278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882715.41550: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882715.41571: variable 'omit' from source: magic vars 26764 1726882715.41632: starting attempt loop 26764 
1726882715.41640: running the handler 26764 1726882715.41670: _low_level_execute_command(): starting 26764 1726882715.41689: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882715.43161: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882715.43184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.43206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.43232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.43278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.43296: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882715.43319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.43344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882715.43357: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882715.43372: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882715.43386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.43403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.43422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.43440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.43451: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882715.43469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.43556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882715.43577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882715.43599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882715.43733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882715.45399: stdout chunk (state=3): >>>/root <<< 26764 1726882715.45578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882715.45581: stdout chunk (state=3): >>><<< 26764 1726882715.45596: stderr chunk (state=3): >>><<< 26764 1726882715.45702: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 
10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882715.45705: _low_level_execute_command(): starting 26764 1726882715.45709: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701 `" && echo ansible-tmp-1726882715.4561317-26808-135731481638701="` echo /root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701 `" ) && sleep 0' 26764 1726882715.46279: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882715.46293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.46306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.46322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.46373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.46385: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882715.46526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.46639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882715.46651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882715.46662: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882715.46683: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.46696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.46711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.46722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.46741: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882715.46755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.46833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882715.46861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882715.46883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882715.47014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882715.48890: stdout chunk (state=3): >>>ansible-tmp-1726882715.4561317-26808-135731481638701=/root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701 <<< 26764 1726882715.49099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882715.49104: stdout chunk (state=3): >>><<< 26764 1726882715.49107: stderr chunk (state=3): >>><<< 26764 1726882715.49280: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882715.4561317-26808-135731481638701=/root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882715.49284: variable 'ansible_module_compression' from source: unknown 26764 1726882715.49286: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26764 1726882715.49390: variable 'ansible_facts' from source: unknown 26764 1726882715.49521: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701/AnsiballZ_setup.py 26764 1726882715.49691: Sending initial data 26764 1726882715.49694: Sent initial data (154 bytes) 26764 1726882715.50755: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882715.50773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.50787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.50822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.50861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.50879: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882715.50906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.50947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882715.50959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882715.50976: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882715.50989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.51003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.51024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.51046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.51057: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882715.51076: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.51161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882715.51183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882715.51196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882715.51326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882715.53141: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882715.53240: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882715.53347: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpzai5v5sx /root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701/AnsiballZ_setup.py <<< 26764 1726882715.53444: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882715.56786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882715.57045: stderr chunk (state=3): >>><<< 26764 1726882715.57048: stdout chunk (state=3): >>><<< 26764 1726882715.57050: done transferring module to remote 26764 1726882715.57052: _low_level_execute_command(): starting 26764 1726882715.57055: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701/ /root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701/AnsiballZ_setup.py && sleep 0' 26764 1726882715.57595: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882715.57608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.57622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.57639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.57684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.57699: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882715.57717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.57736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882715.57748: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882715.57759: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882715.57778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.57792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.57809: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.57824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.57835: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882715.57848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.57929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882715.57950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882715.57969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882715.58098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882715.59938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882715.60013: stderr chunk (state=3): >>><<< 26764 1726882715.60026: stdout chunk (state=3): >>><<< 26764 1726882715.60077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882715.60083: _low_level_execute_command(): starting 26764 1726882715.60096: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701/AnsiballZ_setup.py && sleep 0' 26764 1726882715.61444: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882715.61458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.61485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.61512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.61552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.61568: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882715.61579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.61597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882715.61610: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address <<< 26764 1726882715.61620: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882715.61628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.61638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.61649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.61656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.61663: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882715.61677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.61785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882715.61846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882715.61862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882715.62022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882715.64037: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 26764 1726882715.64106: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 26764 1726882715.64206: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 26764 1726882715.64295: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 26764 1726882715.64317: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 26764 1726882715.64434: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd6cdc0> <<< 26764 1726882715.64523: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd113a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd6cb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd6cac0> <<< 26764 1726882715.64567: stdout chunk (state=3): >>>import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd11490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from 
'/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd11940> <<< 26764 1726882715.64836: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd11670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcc8190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 26764 1726882715.64891: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcc8220> <<< 26764 1726882715.65025: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fceb850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcc8940> <<< 26764 1726882715.65039: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd29880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcc1d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcebd90> <<< 26764 1726882715.65100: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd11970> <<< 26764 1726882715.65113: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 26764 1726882715.65450: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 26764 1726882715.65489: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 26764 1726882715.65515: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 26764 1726882715.65535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 26764 1726882715.65579: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 26764 1726882715.65582: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc67eb0> <<< 26764 1726882715.65625: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc6af40> <<< 26764 1726882715.65657: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 26764 1726882715.65661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 26764 1726882715.65691: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 26764 1726882715.65732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 26764 1726882715.65737: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 26764 1726882715.65761: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc60610> <<< 26764 1726882715.65784: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc66640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc67370> <<< 26764 1726882715.65797: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 26764 1726882715.65863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 26764 1726882715.65891: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 26764 1726882715.65914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882715.66037: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f94ce20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f94c910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f94cf10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 26764 1726882715.66069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 26764 1726882715.66089: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f94cfd0> <<< 26764 1726882715.66119: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95f0d0> <<< 26764 1726882715.66131: stdout chunk (state=3): >>>import '_collections' # <<< 26764 1726882715.66189: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc42d90> <<< 26764 1726882715.66192: stdout chunk (state=3): >>>import '_functools' # <<< 26764 1726882715.66210: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc3b670> <<< 26764 1726882715.66289: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc4e6d0> <<< 26764 1726882715.66292: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc6ee20> <<< 26764 1726882715.66317: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 26764 1726882715.66336: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f95fcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc422b0> <<< 26764 1726882715.66375: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882715.66395: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4fc4e2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc749d0> <<< 26764 1726882715.66425: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 26764 
1726882715.66445: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882715.66477: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 26764 1726882715.66487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95feb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95fdf0> <<< 26764 1726882715.66515: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 26764 1726882715.66537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95fd60> <<< 26764 1726882715.66560: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 26764 1726882715.66580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 26764 1726882715.66606: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 26764 1726882715.66653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 26764 1726882715.66684: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f9323d0> <<< 26764 1726882715.66712: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 26764 1726882715.66723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 26764 1726882715.66746: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f9324c0> <<< 26764 1726882715.66875: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f966f40> <<< 26764 1726882715.66913: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f961a90> <<< 26764 1726882715.66937: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f961490> <<< 26764 1726882715.66950: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 26764 1726882715.66988: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py 
<<< 26764 1726882715.67011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 26764 1726882715.67032: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f84c220> <<< 26764 1726882715.67062: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f91d520> <<< 26764 1726882715.67117: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f961f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc74040> <<< 26764 1726882715.67140: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 26764 1726882715.67159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 26764 1726882715.67197: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 26764 1726882715.67237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f85eb50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f85ee80> <<< 26764 1726882715.67265: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 26764 1726882715.67297: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 26764 1726882715.67315: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f86f790> <<< 26764 1726882715.67327: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 26764 1726882715.67359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 26764 1726882715.67388: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f86fcd0> <<< 26764 1726882715.67432: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f7fd400> <<< 26764 1726882715.67458: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f85ef70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 26764 1726882715.67473: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 26764 1726882715.67509: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f80e2e0> <<< 26764 1726882715.67529: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f86f610> import 'pwd' # <<< 26764 1726882715.67554: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f80e3a0> <<< 26764 1726882715.67589: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95fa30> <<< 26764 1726882715.67616: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 26764 1726882715.67628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 26764 1726882715.67653: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 26764 1726882715.67720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f829700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 26764 1726882715.67782: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f8299d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f8297c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f8298b0> <<< 26764 1726882715.68085: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f829d00> # extension module '_blake2' loaded from 
'/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f834250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f829940> <<< 26764 1726882715.68109: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f81da90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95f610> <<< 26764 1726882715.68268: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 26764 1726882715.68298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f829af0> <<< 26764 1726882715.68453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4d4f7456d0> <<< 26764 1726882715.68639: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip' # zipimport: zlib available <<< 26764 1726882715.68811: stdout chunk (state=3): >>># zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 26764 1726882715.69999: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.70962: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882715.71048: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f163160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163280> <<< 26764 1726882715.71096: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163f70> <<< 26764 1726882715.71110: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches 
/usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 26764 1726882715.71147: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1634f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163d90> import 'atexit' # <<< 26764 1726882715.71202: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f163fd0> <<< 26764 1726882715.71205: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 26764 1726882715.71345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 26764 1726882715.71358: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 26764 1726882715.71445: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f13a0d0> <<< 26764 1726882715.71483: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f03f310> <<< 26764 1726882715.71519: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f03f160> <<< 26764 1726882715.71532: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 26764 1726882715.71576: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f03fca0> <<< 26764 1726882715.71590: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f14adc0> <<< 26764 1726882715.71755: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f14a3a0> <<< 26764 1726882715.71776: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 
26764 1726882715.71800: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f14afd0> <<< 26764 1726882715.71814: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 26764 1726882715.71854: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 26764 1726882715.71894: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 26764 1726882715.71911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 26764 1726882715.71914: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f19ad30> <<< 26764 1726882715.72034: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f145d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f145400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f118b20> <<< 26764 1726882715.72040: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f145520> <<< 26764 1726882715.72071: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f145550> <<< 26764 1726882715.72085: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 26764 1726882715.72099: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 26764 1726882715.72133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 26764 1726882715.72212: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882715.72222: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0aafd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1ac250> <<< 26764 1726882715.72235: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches 
/usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 26764 1726882715.72290: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0a7850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1ac3d0> <<< 26764 1726882715.72309: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 26764 1726882715.72351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882715.72382: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 26764 1726882715.72434: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1acca0> <<< 26764 1726882715.72590: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0a77f0> <<< 26764 1726882715.72655: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f144c10> <<< 26764 1726882715.72767: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f1acfa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f1ac550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1a4910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 26764 1726882715.72787: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 26764 1726882715.72905: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f4d4f09d940> <<< 26764 1726882715.73066: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0bbd90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0a6580> <<< 26764 1726882715.73108: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f09dee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0a69a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 26764 1726882715.73121: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.73179: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.73248: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.73349: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available <<< 26764 1726882715.73361: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 26764 1726882715.73410: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.73532: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.73944: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.74427: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 26764 1726882715.74454: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882715.74483: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0b77f0> <<< 26764 1726882715.74572: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 26764 
1726882715.74644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0f58b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ec2d970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 26764 1726882715.74675: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26764 1726882715.74678: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 26764 1726882715.74784: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.74911: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 26764 1726882715.74932: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f120730> # zipimport: zlib available <<< 26764 1726882715.75317: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.75708: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.75729: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.75802: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 26764 1726882715.75883: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 26764 1726882715.75886: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.75923: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.76027: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 26764 1726882715.76067: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882715.76119: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 26764 1726882715.76129: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.76290: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.76511: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 26764 1726882715.76514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 26764 1726882715.76583: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f166370> 
# zipimport: zlib available <<< 26764 1726882715.76644: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.76729: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 26764 1726882715.76744: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 26764 1726882715.76747: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.76776: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.76823: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 26764 1726882715.76826: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.76880: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.76909: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.76998: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.77038: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 26764 1726882715.77068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882715.77139: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0d9550> <<< 26764 1726882715.77227: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4eaa8eb0> <<< 26764 1726882715.77273: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 26764 1726882715.77317: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.77374: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.77398: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.77450: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 26764 1726882715.77476: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 26764 1726882715.77516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 26764 1726882715.77531: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 26764 1726882715.77543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 26764 1726882715.77631: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0d5820> <<< 26764 1726882715.77775: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0dd790> <<< 26764 1726882715.77806: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0d9b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 26764 1726882715.77885: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available <<< 26764 1726882715.77896: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 26764 1726882715.77941: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78013: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78061: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78097: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882715.78147: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78190: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 26764 1726882715.78229: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78334: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78354: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78387: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 26764 1726882715.78500: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78623: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78658: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.78695: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882715.78727: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 26764 1726882715.78763: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 26764 1726882715.78785: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ebf2370> <<< 26764 1726882715.78822: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 26764 1726882715.78856: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 26764 1726882715.78898: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 26764 1726882715.78910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ec0f580> <<< 26764 1726882715.78951: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4ec0f4f0> <<< 26764 1726882715.79005: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ebe2280> <<< 26764 1726882715.79018: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ebf2970> <<< 26764 1726882715.79073: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e9a97f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e9a9b20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 26764 1726882715.79104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 26764 1726882715.79148: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4ec530a0> <<< 26764 1726882715.79179: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ebf1f70> <<< 26764 1726882715.79199: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 26764 1726882715.79219: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ec53190> <<< 26764 1726882715.79237: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 26764 1726882715.79249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 26764 1726882715.79280: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4ea12fd0> <<< 26764 1726882715.79317: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ec3e820> <<< 26764 1726882715.79343: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e9a9d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 26764 1726882715.79365: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26764 1726882715.79389: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 26764 1726882715.79416: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.79459: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.79494: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 26764 1726882715.79506: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.79535: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.79597: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available <<< 26764 1726882715.79616: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 26764 1726882715.79644: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 26764 1726882715.79673: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 26764 1726882715.79684: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.79719: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.79763: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 26764 1726882715.79776: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.79838: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.79867: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 26764 1726882715.79916: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.79994: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.80063: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 26764 1726882715.80489: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.80806: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 26764 1726882715.80853: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.80900: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.80930: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.81000: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 26764 1726882715.81032: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.81035: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 26764 1726882715.81086: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.81145: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 26764 1726882715.81180: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.81194: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip 
/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 26764 1726882715.81262: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.81267: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 26764 1726882715.81315: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.81422: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e8fce80> <<< 26764 1726882715.81443: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 26764 1726882715.81456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 26764 1726882715.81616: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e8fc9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 26764 1726882715.81672: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.81732: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 26764 1726882715.81813: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.81898: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 26764 1726882715.81901: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.81945: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.82015: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 26764 1726882715.82060: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.82097: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 26764 1726882715.82124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 26764 1726882715.82279: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4e972490> <<< 26764 1726882715.82512: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e90b850> import ansible.module_utils.facts.system.python # loaded from Zip 
/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 26764 1726882715.82515: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.82600: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.82615: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 26764 1726882715.82679: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.82755: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.82847: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.82986: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 26764 1726882715.82989: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.83015: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.83062: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available <<< 26764 1726882715.83086: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.83140: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 26764 1726882715.83295: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4e970670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e970220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available <<< 26764 1726882715.83306: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.83435: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882715.83554: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 26764 1726882715.83642: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.83748: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.83811: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 26764 1726882715.83814: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.83896: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.83924: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.84028: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.84158: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 26764 1726882715.84295: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.84381: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882715.84419: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.84840: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.85257: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 26764 1726882715.85261: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.85342: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.85433: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 26764 1726882715.85524: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.85597: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 26764 1726882715.85614: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.85755: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.85895: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 26764 1726882715.85917: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip 
/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 26764 1726882715.85932: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.85977: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 26764 1726882715.85980: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86071: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86134: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86303: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86486: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 26764 1726882715.86489: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86511: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86543: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 26764 1726882715.86572: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86626: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86679: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 26764 1726882715.86779: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86782: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 26764 1726882715.86824: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86923: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 26764 1726882715.86935: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.86984: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 26764 1726882715.87199: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.87422: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip 
/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 26764 1726882715.87425: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.87465: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.87526: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 26764 1726882715.87642: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.87649: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882715.87731: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available <<< 26764 1726882715.87796: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.87887: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882715.87900: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available <<< 26764 1726882715.87937: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.87993: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 26764 1726882715.88012: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88054: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 26764 1726882715.88092: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88167: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88227: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 26764 1726882715.88248: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88276: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88329: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # 
loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 26764 1726882715.88332: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88499: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88665: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 26764 1726882715.88688: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88721: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88770: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 26764 1726882715.88773: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88845: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.88869: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 26764 1726882715.88919: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.89007: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 26764 1726882715.89010: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.89078: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.89161: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py <<< 26764 1726882715.89166: stdout chunk (state=3): >>>import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 26764 1726882715.89230: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882715.89423: stdout chunk (state=3): >>>import 'gc' # <<< 26764 1726882715.89736: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 26764 1726882715.89777: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 26764 1726882715.89780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 26764 1726882715.89812: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from 
'/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4e8b4a90> <<< 26764 1726882715.89826: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e909af0> <<< 26764 1726882715.89886: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e909c10> <<< 26764 1726882715.91882: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_de<<< 26764 1726882715.91900: stdout chunk (state=3): >>>clare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "35", "epoch": "1726882715", "epoch_int": "1726882715", "date": "2024-09-20", "time": "21:38:35", "iso8601_micro": "2024-09-21T01:38:35.911172Z", "iso8601": "2024-09-21T01:38:35Z", "iso8601_basic": "20240920T213835911172", "iso8601_basic_short": "20240920T213835", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26764 1726882715.92386: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 26764 1726882715.92434: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing 
builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct <<< 26764 1726882715.92513: stdout chunk (state=3): >>># cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json 
# cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors <<< 26764 1726882715.92578: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # 
cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy 
ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix <<< 26764 1726882715.92588: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 26764 1726882715.92860: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 26764 1726882715.92871: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 26764 1726882715.92906: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 26764 1726882715.92909: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 26764 1726882715.92968: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 26764 1726882715.92971: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 26764 1726882715.92983: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 26764 1726882715.93017: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 26764 1726882715.93092: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle 
# destroy multiprocessing.context # destroy array <<< 26764 1726882715.93114: stdout chunk (state=3): >>># destroy _compat_pickle <<< 26764 1726882715.93138: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 26764 1726882715.93172: stdout chunk (state=3): >>># destroy shlex # destroy datetime <<< 26764 1726882715.93198: stdout chunk (state=3): >>># destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass <<< 26764 1726882715.93217: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 26764 1726882715.93271: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 26764 1726882715.93326: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess <<< 26764 1726882715.93387: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 26764 1726882715.93450: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types <<< 26764 1726882715.93523: stdout chunk (state=3): >>># cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # 
cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 26764 1726882715.93543: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 26764 1726882715.93577: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket <<< 26764 1726882715.93580: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 26764 1726882715.93758: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse <<< 26764 1726882715.93791: stdout chunk (state=3): >>># destroy tokenize <<< 26764 1726882715.93822: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 26764 1726882715.93843: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 26764 1726882715.93845: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 26764 1726882715.93875: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 26764 1726882715.94157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
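For reference, the invocation echoed in the result below ran ansible.builtin.setup with module_args {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}, which is what produced the ansible_facts payload above (ansible_distribution "CentOS", ansible_pkg_mgr "dnf", ansible_service_mgr "systemd", "module_setup": true). A minimal sketch of an equivalent explicit call, assuming a hypothetical play against managed_node1; the task names and the registered variable are illustrative and not taken from this run:

  - hosts: managed_node1
    gather_facts: false
    tasks:
      - name: Gather minimal facts          # mirrors the module_args seen in the result
        ansible.builtin.setup:
          gather_subset:
            - min
          gather_timeout: 10
          fact_path: /etc/ansible/facts.d
        register: setup_result

      - name: Show a couple of the returned facts
        ansible.builtin.debug:
          msg: "{{ setup_result.ansible_facts.ansible_distribution }} {{ setup_result.ansible_facts.ansible_distribution_version }} uses {{ setup_result.ansible_facts.ansible_pkg_mgr }}"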
<<< 26764 1726882715.94242: stderr chunk (state=3): >>><<< 26764 1726882715.94245: stdout chunk (state=3): >>><<< 26764 1726882715.94479: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd6cdc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd113a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd6cb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd6cac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd11490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd11940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd11670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcc8190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcc8220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from 
'/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fceb850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcc8940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd29880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcc1d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fcebd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fd11970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc67eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc6af40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc60610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc66640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc67370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # 
code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f94ce20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f94c910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f94cf10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f94cfd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95f0d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc42d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc3b670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc4e6d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc6ee20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f95fcd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc422b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4fc4e2e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc749d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95feb0> 
import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95fdf0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95fd60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f9323d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f9324c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f966f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f961a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f961490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f84c220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f91d520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f961f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4fc74040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f85eb50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f85ee80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches 
/usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f86f790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f86fcd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f7fd400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f85ef70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f80e2e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f86f610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f80e3a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95fa30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f829700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f8299d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f8297c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f8298b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f829d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f834250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f829940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f81da90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f95f610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f829af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4d4f7456d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f163160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches 
/usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1634f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163d90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f163fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f163100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f13a0d0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f03f310> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f03f160> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f03fca0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f14adc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f14a3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f14afd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches 
/usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f19ad30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f145d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f145400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f118b20> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f145520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f145550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0aafd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1ac250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0a7850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1ac3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1acca0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0a77f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f144c10> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f1acfa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f1ac550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f1a4910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f09d940> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0bbd90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0a6580> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f09dee0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0a69a0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0b77f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0f58b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ec2d970> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f120730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f166370> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip 
/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4f0d9550> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4eaa8eb0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0d5820> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0dd790> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4f0d9b50> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip 
/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ebf2370> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ec0f580> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4ec0f4f0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ebe2280> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ebf2970> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e9a97f0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e9a9b20> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4ec530a0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ebf1f70> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ec53190> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4ea12fd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4ec3e820> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e9a9d60> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # 
loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e8fce80> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e8fc9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # 
extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4e972490> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e90b850> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4e970670> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e970220> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip 
/tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib 
available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_upf2ahka/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4d4e8b4a90> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e909af0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4d4e909c10> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_env": 
{"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "35", "epoch": "1726882715", "epoch_int": "1726882715", "date": "2024-09-20", "time": "21:38:35", 
"iso8601_micro": "2024-09-21T01:38:35.911172Z", "iso8601": "2024-09-21T01:38:35Z", "iso8601_basic": "20240920T213835911172", "iso8601_basic_short": "20240920T213835", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # 
cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing 
ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # 
cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
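The stdout captured above is the result of the setup module run with the minimum fact subset; the host reports CentOS Stream 9, Python 3.9, SELinux enforcing, systemd as the service manager, and dnf as the package manager. A minimal sketch of an equivalent fact-gathering play, with the play layout and host name chosen for illustration rather than taken from this run:

- hosts: managed_node2           # illustrative target; use your own inventory host
  gather_facts: false            # facts are gathered explicitly by the task below
  tasks:
    - name: Gather the minimum subset of ansible_facts
      ansible.builtin.setup:
        gather_subset:
          - min                  # matches the gather_subset seen in the module invocation above
        gather_timeout: 10       # matches the gather_timeout seen in the module invocation above

Running such a task at this verbosity produces a trace of the same shape: the module payload is sent over the existing SSH control connection, executed remotely, and the remote temporary directory is removed afterwards.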
[WARNING]: Module invocation had junk after the JSON data: 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 26764 1726882715.95589: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882715.95592: _low_level_execute_command(): starting 26764 1726882715.95595: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882715.4561317-26808-135731481638701/ > /dev/null 2>&1 && sleep 0' 26764 1726882715.96970: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882715.97026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.97040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.97063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.97105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.97129: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882715.97141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.97159: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882715.97240: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882715.97250: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882715.97261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882715.97280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882715.97295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882715.97306: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882715.97315: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882715.97326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882715.97473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882715.97499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882715.97515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882715.97646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882715.99535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882715.99539: stdout chunk (state=3): >>><<< 26764 1726882715.99541: stderr chunk (state=3): >>><<< 26764 1726882715.99875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882715.99879: handler run complete 26764 1726882715.99882: variable 'ansible_facts' from source: unknown 26764 1726882715.99884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882716.00009: variable 'ansible_facts' from source: unknown 26764 1726882716.00058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882716.00127: attempt loop complete, returning result 26764 1726882716.00135: _execute() done 26764 1726882716.00142: dumping result to json 26764 1726882716.00158: done dumping result, returning 26764 1726882716.00174: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-9875-c9a3-000000000048] 26764 1726882716.00183: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000048 ok: [managed_node2] 26764 1726882716.00485: no more pending results, returning what we have 26764 1726882716.00488: results queue empty 26764 1726882716.00489: checking for any_errors_fatal 26764 1726882716.00490: done checking for any_errors_fatal 26764 1726882716.00490: checking for max_fail_percentage 26764 1726882716.00492: done checking for max_fail_percentage 26764 1726882716.00493: checking to see if all hosts have failed and the running result is not ok 26764 1726882716.00494: done checking to see if all hosts have 
failed 26764 1726882716.00495: getting the remaining hosts for this loop 26764 1726882716.00496: done getting the remaining hosts for this loop 26764 1726882716.00500: getting the next task for host managed_node2 26764 1726882716.00509: done getting next task for host managed_node2 26764 1726882716.00511: ^ task is: TASK: Check if system is ostree 26764 1726882716.00514: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882716.00517: getting variables 26764 1726882716.00519: in VariableManager get_vars() 26764 1726882716.00546: Calling all_inventory to load vars for managed_node2 26764 1726882716.00548: Calling groups_inventory to load vars for managed_node2 26764 1726882716.00551: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882716.00561: Calling all_plugins_play to load vars for managed_node2 26764 1726882716.00568: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882716.00571: Calling groups_plugins_play to load vars for managed_node2 26764 1726882716.00749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882716.00958: done with get_vars() 26764 1726882716.00971: done getting variables 26764 1726882716.01115: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000048 26764 1726882716.01118: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:38:36 -0400 (0:00:00.732) 0:00:01.953 ****** 26764 1726882716.01199: entering _queue_task() for managed_node2/stat 26764 1726882716.02488: worker is 1 (out of 1 available) 26764 1726882716.02500: exiting _queue_task() for managed_node2/stat 26764 1726882716.02511: done queuing things up, now waiting for results queue to drain 26764 1726882716.02512: waiting for pending results... 
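The next task in the trace, Check if system is ostree (el_repo_setup.yml:17), is queued for the stat module and, as the trace below shows, is gated on the conditional not __network_is_ostree is defined. A minimal sketch of such a check, assuming the conventional /run/ostree-booted marker file; the exact path and register name in el_repo_setup.yml may differ:

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # marker file present on OSTree-based systems (assumed path, not taken from this log)
  register: __ostree_booted_stat    # illustrative register name
  when: not __network_is_ostree is defined

The registered result's stat.exists flag can then be used to set a fact such as __network_is_ostree so later tasks skip the check.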
26764 1726882716.02750: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 26764 1726882716.02851: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000004a 26764 1726882716.02902: variable 'ansible_search_path' from source: unknown 26764 1726882716.02909: variable 'ansible_search_path' from source: unknown 26764 1726882716.02949: calling self._execute() 26764 1726882716.03028: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882716.03038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882716.03052: variable 'omit' from source: magic vars 26764 1726882716.03630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882716.04205: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882716.04253: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882716.04407: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882716.04444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882716.04647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882716.04680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882716.04726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882716.04829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882716.04983: Evaluated conditional (not __network_is_ostree is defined): True 26764 1726882716.04994: variable 'omit' from source: magic vars 26764 1726882716.05068: variable 'omit' from source: magic vars 26764 1726882716.05107: variable 'omit' from source: magic vars 26764 1726882716.05134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882716.05173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882716.05199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882716.05219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882716.05242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882716.05281: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882716.05289: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882716.05297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882716.05398: Set connection var ansible_shell_executable to /bin/sh 26764 1726882716.05406: Set connection var ansible_shell_type to sh 26764 1726882716.05420: Set 
connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882716.05428: Set connection var ansible_timeout to 10 26764 1726882716.05436: Set connection var ansible_connection to ssh 26764 1726882716.05444: Set connection var ansible_pipelining to False 26764 1726882716.05470: variable 'ansible_shell_executable' from source: unknown 26764 1726882716.05484: variable 'ansible_connection' from source: unknown 26764 1726882716.05491: variable 'ansible_module_compression' from source: unknown 26764 1726882716.05498: variable 'ansible_shell_type' from source: unknown 26764 1726882716.05504: variable 'ansible_shell_executable' from source: unknown 26764 1726882716.05511: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882716.05518: variable 'ansible_pipelining' from source: unknown 26764 1726882716.05524: variable 'ansible_timeout' from source: unknown 26764 1726882716.05531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882716.05669: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882716.05684: variable 'omit' from source: magic vars 26764 1726882716.05699: starting attempt loop 26764 1726882716.05705: running the handler 26764 1726882716.05722: _low_level_execute_command(): starting 26764 1726882716.05735: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882716.06442: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882716.06455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.06478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882716.06497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.06537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.06548: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882716.06561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.06586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882716.06599: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882716.06608: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882716.06620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.06642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882716.06659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.06673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.06684: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882716.06704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.06782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882716.06819: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 26764 1726882716.06837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882716.06965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882716.08585: stdout chunk (state=3): >>>/root <<< 26764 1726882716.08822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882716.08893: stderr chunk (state=3): >>><<< 26764 1726882716.09037: stdout chunk (state=3): >>><<< 26764 1726882716.09142: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882716.09152: _low_level_execute_command(): starting 26764 1726882716.09154: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949 `" && echo ansible-tmp-1726882716.09062-26850-59271690699949="` echo /root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949 `" ) && sleep 0' 26764 1726882716.09867: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.09871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.10718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.10729: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882716.10741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.10757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882716.10770: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882716.10784: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882716.10804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.10817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882716.10830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.10840: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.10850: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882716.10861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.10948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882716.10970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882716.10985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882716.11118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882716.12987: stdout chunk (state=3): >>>ansible-tmp-1726882716.09062-26850-59271690699949=/root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949 <<< 26764 1726882716.13292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882716.13361: stderr chunk (state=3): >>><<< 26764 1726882716.13383: stdout chunk (state=3): >>><<< 26764 1726882716.13690: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882716.09062-26850-59271690699949=/root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882716.13694: variable 'ansible_module_compression' from source: unknown 26764 1726882716.13697: ANSIBALLZ: Using lock for stat 26764 1726882716.13699: ANSIBALLZ: Acquiring lock 26764 1726882716.13701: ANSIBALLZ: Lock acquired: 140693693674080 26764 1726882716.13703: ANSIBALLZ: Creating module 26764 1726882716.45509: ANSIBALLZ: Writing module into payload 26764 1726882716.45922: ANSIBALLZ: Writing module 26764 1726882716.45945: ANSIBALLZ: Renaming module 26764 1726882716.45949: ANSIBALLZ: Done creating module 26764 1726882716.45970: variable 'ansible_facts' from source: unknown 26764 1726882716.46156: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949/AnsiballZ_stat.py 26764 1726882716.46872: Sending initial data 26764 1726882716.46875: Sent initial data (150 bytes) 26764 1726882716.49411: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882716.49420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.49430: stderr chunk (state=3): >>>debug1: Reading configuration data 
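For context: the ANSIBALLZ messages above are Ansible building AnsiballZ_stat.py, a self-contained wrapper script with the stat module and its module_utils dependencies embedded as a zip payload, and starting to push it into the remote temp directory. The actual copy happens over SFTP (the 'sftp> put ...' line a little further down). A rough sketch of that transfer as a standalone sftp batch call; the user@host target is an assumption based on the root home directory and the 10.31.11.158 address in the log:

    import subprocess

    # Local and remote paths copied from the log; the target host is assumed.
    local = "/root/.ansible/tmp/ansible-local-26764trh16hvb/tmpllb5b7qc"
    remote = ("/root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949"
              "/AnsiballZ_stat.py")
    batch = "put %s %s\n" % (local, remote)
    # 'sftp -b -' reads batch commands from stdin, mirroring the 'sftp> put ...'
    # line recorded below.
    subprocess.run(["sftp", "-b", "-", "root@10.31.11.158"],
                   input=batch, text=True, check=True)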
/etc/ssh/ssh_config <<< 26764 1726882716.49445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.49486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.49501: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882716.49613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.49627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882716.49634: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882716.49641: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882716.49649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.49658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882716.49674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.49682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.49689: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882716.49698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.49779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882716.49791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882716.49832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882716.50052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882716.51888: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 26764 1726882716.51891: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882716.51991: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 26764 1726882716.52007: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882716.52113: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpllb5b7qc /root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949/AnsiballZ_stat.py <<< 26764 1726882716.52199: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882716.53661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882716.53670: stderr chunk (state=3): >>><<< 26764 1726882716.53673: stdout chunk (state=3): >>><<< 26764 1726882716.53695: done transferring module to remote 26764 1726882716.53714: _low_level_execute_command(): starting 26764 1726882716.53718: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949/ /root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949/AnsiballZ_stat.py && sleep 0' 26764 1726882716.56380: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.56470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882716.56547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882716.56550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882716.56676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882716.58528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882716.58531: stderr chunk (state=3): >>><<< 26764 1726882716.58539: stdout chunk (state=3): >>><<< 26764 1726882716.58570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882716.58573: _low_level_execute_command(): starting 26764 1726882716.58578: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949/AnsiballZ_stat.py && sleep 0' 26764 1726882716.60199: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882716.60296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.60299: stderr chunk (state=3): >>>debug1: Reading configuration data 
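For context: the chmod u+x call above marks both the temp directory and AnsiballZ_stat.py executable, and the next _low_level_execute_command() runs the wrapper with the remote /usr/bin/python3.9 under PYTHONVERBOSE=1, which is why the following stdout chunks are full of import tracing. A minimal Python analogue of the chmod step, reusing the paths from the log:

    import os
    import stat

    # Equivalent of `chmod u+x <tmpdir> <tmpdir>/AnsiballZ_stat.py`: add the
    # owner-execute bit without touching the other permission bits.
    tmpdir = "/root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949"
    for path in (tmpdir, os.path.join(tmpdir, "AnsiballZ_stat.py")):
        os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)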
/etc/ssh/ssh_config <<< 26764 1726882716.60305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.60458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.60470: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882716.60480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.60493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882716.60501: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882716.60517: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882716.60520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.60530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882716.60605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.60613: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.60621: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882716.60632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.60710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882716.60874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882716.60918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882716.61099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882716.63094: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 26764 1726882716.63098: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 26764 1726882716.63170: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 26764 1726882716.63204: stdout chunk (state=3): >>>import 'posix' # <<< 26764 1726882716.63228: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 26764 1726882716.63233: stdout chunk (state=3): >>># installing zipimport hook <<< 26764 1726882716.63275: stdout chunk (state=3): >>>import 'time' # <<< 26764 1726882716.63278: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 26764 1726882716.63330: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882716.63355: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 26764 1726882716.63374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 26764 1726882716.63380: stdout chunk (state=3): >>>import '_codecs' # <<< 26764 1726882716.63399: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a773dc0> <<< 26764 1726882716.63446: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches 
/usr/lib64/python3.9/encodings/aliases.py <<< 26764 1726882716.63450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 26764 1726882716.63453: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a7183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a773b20> <<< 26764 1726882716.63497: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 26764 1726882716.63514: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a773ac0> <<< 26764 1726882716.63519: stdout chunk (state=3): >>>import '_signal' # <<< 26764 1726882716.63553: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 26764 1726882716.63560: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a718490> <<< 26764 1726882716.63583: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 26764 1726882716.63601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 26764 1726882716.63620: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 26764 1726882716.63628: stdout chunk (state=3): >>>import '_abc' # <<< 26764 1726882716.63632: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a718940> <<< 26764 1726882716.63655: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a718670> <<< 26764 1726882716.63690: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 26764 1726882716.63698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 26764 1726882716.63716: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 26764 1726882716.63746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 26764 1726882716.63753: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 26764 1726882716.63793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 26764 1726882716.63797: stdout chunk (state=3): >>>import '_stat' # <<< 26764 1726882716.63801: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6cf190> <<< 26764 1726882716.63819: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 26764 1726882716.63841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 26764 1726882716.63917: stdout 
chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6cf220> <<< 26764 1726882716.63941: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 26764 1726882716.63946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 26764 1726882716.63974: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 26764 1726882716.63978: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6f2850> <<< 26764 1726882716.63980: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6cf940> <<< 26764 1726882716.64023: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a730880> <<< 26764 1726882716.64033: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 26764 1726882716.64036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 26764 1726882716.64039: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6c8d90> <<< 26764 1726882716.64098: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 26764 1726882716.64101: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6f2d90> <<< 26764 1726882716.64160: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a718970> <<< 26764 1726882716.64188: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
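For context: everything from 'import _frozen_importlib' down to the version banner above is simply the remote Python 3.9 interpreter starting up with PYTHONVERBOSE=1, which reports every module import and the .pyc file it came from. The same kind of trace can be reproduced with the interpreter's -v switch; a small sketch:

    import subprocess
    import sys

    # Run a child interpreter with -v (the command-line form of PYTHONVERBOSE=1)
    # and pick out the "import ..." lines it writes to stderr, the same kind of
    # trace that fills the stdout chunks above.
    trace = subprocess.run([sys.executable, "-v", "-c", "import json"],
                           capture_output=True, text=True)
    for line in trace.stderr.splitlines():
        if line.startswith("import "):
            print(line)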
<<< 26764 1726882716.64387: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 26764 1726882716.64394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 26764 1726882716.64422: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 26764 1726882716.64452: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 26764 1726882716.64460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 26764 1726882716.64484: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 26764 1726882716.64501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 26764 1726882716.64508: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a66eeb0> <<< 26764 1726882716.64550: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a670f40> <<< 26764 1726882716.64576: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 26764 1726882716.64582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 26764 1726882716.64609: stdout chunk (state=3): >>>import '_sre' # <<< 26764 1726882716.64615: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 26764 1726882716.64649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 26764 1726882716.64656: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 26764 1726882716.64659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 26764 1726882716.64694: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a667610> <<< 26764 1726882716.64698: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a66d640> <<< 26764 1726882716.64703: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a66e370> <<< 26764 1726882716.64727: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 26764 1726882716.64796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 26764 1726882716.64822: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 26764 1726882716.64846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882716.64870: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from 
'/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 26764 1726882716.64913: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a3d4e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3d4910> <<< 26764 1726882716.64918: stdout chunk (state=3): >>>import 'itertools' # <<< 26764 1726882716.64941: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 26764 1726882716.64956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3d4f10> <<< 26764 1726882716.64970: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 26764 1726882716.64979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 26764 1726882716.65033: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3d4fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 26764 1726882716.65036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 26764 1726882716.65050: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e70d0> <<< 26764 1726882716.65055: stdout chunk (state=3): >>>import '_collections' # <<< 26764 1726882716.65102: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a649d90> <<< 26764 1726882716.65108: stdout chunk (state=3): >>>import '_functools' # <<< 26764 1726882716.65133: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a642670> <<< 26764 1726882716.65189: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 26764 1726882716.65207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6546d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a675e20> <<< 26764 1726882716.65210: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py <<< 26764 1726882716.65213: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 26764 1726882716.65251: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.65257: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a3e7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6492b0> <<< 26764 1726882716.65291: stdout 
chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.65297: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a6542e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a67b9d0> <<< 26764 1726882716.65323: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 26764 1726882716.65330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 26764 1726882716.65353: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882716.65380: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 26764 1726882716.65399: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 26764 1726882716.65404: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7df0> <<< 26764 1726882716.65438: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 26764 1726882716.65447: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7d60> <<< 26764 1726882716.65458: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 26764 1726882716.65464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 26764 1726882716.65491: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 26764 1726882716.65497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 26764 1726882716.65520: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 26764 1726882716.65573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 26764 1726882716.65594: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' <<< 26764 1726882716.65604: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3ba3d0> <<< 26764 1726882716.65607: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 26764 1726882716.65630: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 26764 1726882716.65670: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3ba4c0> <<< 26764 1726882716.65787: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3eef40> <<< 26764 1726882716.65823: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e9a90> <<< 26764 1726882716.65828: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e9490> <<< 26764 1726882716.65845: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 26764 1726882716.65873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 26764 1726882716.65889: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 26764 1726882716.65922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 26764 1726882716.65932: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 26764 1726882716.65935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 26764 1726882716.65938: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2ee220> <<< 26764 1726882716.65984: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3a5520> <<< 26764 1726882716.66027: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e9f10> <<< 26764 1726882716.66034: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a67b040> <<< 26764 1726882716.66058: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 26764 1726882716.66142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 26764 1726882716.66146: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 26764 1726882716.66149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a300b50> import 'errno' # <<< 26764 1726882716.66151: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a300e80> <<< 26764 1726882716.66182: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 26764 1726882716.66200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 26764 1726882716.66206: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches 
/usr/lib64/python3.9/_compression.py <<< 26764 1726882716.66212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 26764 1726882716.66778: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a311790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a311cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a29f400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a300f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2b02e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a311610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2b03a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2cb700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2cb9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2cb7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2cb8b0> # 
/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 26764 1726882716.67324: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2cbd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2d6250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2cb940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2bfa90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2cbaf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc'<<< 26764 1726882716.67332: stdout chunk (state=3): >>> <<< 26764 1726882716.67357: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f132a1ef6d0><<< 26764 1726882716.67360: stdout chunk (state=3): >>> <<< 26764 1726882716.67608: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip' <<< 26764 1726882716.67614: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.67749: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.67788: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 26764 1726882716.67811: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.67839: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 26764 1726882716.67844: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.69485: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.70631: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 26764 1726882716.70643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 26764 1726882716.70649: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6820> <<< 26764 1726882716.70689: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 26764 1726882716.70694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882716.70739: stdout chunk (state=3): >>># 
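For context: '# zipimport: found 30 names in .../ansible_stat_payload.zip' and the 'loaded from Zip' lines that follow show the AnsiballZ wrapper putting its embedded zip on sys.path and importing the ansible.module_utils packages straight out of it. A minimal sketch of that import-from-zip mechanism, with a toy package standing in for the real payload:

    import os
    import sys
    import tempfile
    import zipfile

    # Build a tiny zip containing a package, put it on sys.path and import from
    # it via the interpreter's built-in zipimport support, the same mechanism
    # the AnsiballZ payload relies on. This toy package is illustrative only.
    zip_path = os.path.join(tempfile.mkdtemp(), "payload.zip")
    with zipfile.ZipFile(zip_path, "w") as zf:
        zf.writestr("mypkg/__init__.py", "GREETING = 'loaded from zip'\n")

    sys.path.insert(0, zip_path)
    import mypkg  # imported late on purpose, straight out of the zip
    print(mypkg.GREETING, "->", mypkg.__file__)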
/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 26764 1726882716.70745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 26764 1726882716.70786: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 26764 1726882716.70791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 26764 1726882716.70833: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.70845: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.70849: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329bd6160> <<< 26764 1726882716.70916: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6280> <<< 26764 1726882716.70973: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6f70> <<< 26764 1726882716.71004: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 26764 1726882716.71014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 26764 1726882716.71074: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd64f0> <<< 26764 1726882716.71096: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6d90> import 'atexit' # <<< 26764 1726882716.71137: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.71140: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.71143: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329bd6fd0> <<< 26764 1726882716.71183: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 26764 1726882716.71236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 26764 1726882716.71295: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6100> <<< 26764 1726882716.71330: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 26764 1726882716.71353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 26764 1726882716.71388: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 26764 1726882716.71422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 26764 1726882716.71527: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches 
/usr/lib64/python3.9/signal.py <<< 26764 1726882716.71533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 26764 1726882716.71660: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329b2df40> <<< 26764 1726882716.71713: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.71716: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.71719: stdout chunk (state=3): >>>import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b4cd00> <<< 26764 1726882716.71775: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.71779: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.71785: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b4ceb0> <<< 26764 1726882716.71813: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 26764 1726882716.71862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 26764 1726882716.71913: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329b4c370> <<< 26764 1726882716.71944: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a17bdc0> <<< 26764 1726882716.72216: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a17b3a0> <<< 26764 1726882716.72253: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 26764 1726882716.72258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 26764 1726882716.72292: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a17bfd0> <<< 26764 1726882716.72317: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 26764 1726882716.72342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 26764 1726882716.72377: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py <<< 26764 1726882716.72382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 26764 1726882716.72411: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 26764 1726882716.72444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 26764 1726882716.72479: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 26764 1726882716.72490: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 26764 1726882716.72498: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a14cd30> <<< 26764 1726882716.72619: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ba9d30> <<< 26764 1726882716.72625: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ba9400> <<< 26764 1726882716.72647: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bdf4f0> <<< 26764 1726882716.72694: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.72698: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329ba9520> <<< 26764 1726882716.72752: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882716.72758: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ba9550> <<< 26764 1726882716.72796: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 26764 1726882716.72828: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 26764 1726882716.72852: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 26764 1726882716.72908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 26764 1726882716.73022: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b1dfd0> <<< 26764 1726882716.73043: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a15d250> <<< 26764 1726882716.73056: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 26764 1726882716.73084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 26764 1726882716.73174: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.73178: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b1a850> <<< 26764 1726882716.73191: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a15d3d0> 
<<< 26764 1726882716.73210: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 26764 1726882716.73276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882716.73305: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 26764 1726882716.73334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 26764 1726882716.73338: stdout chunk (state=3): >>>import '_string' # <<< 26764 1726882716.73426: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a175e50> <<< 26764 1726882716.73637: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329b1a7f0> <<< 26764 1726882716.73773: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.73791: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b1a640> <<< 26764 1726882716.73828: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.73842: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b195b0> <<< 26764 1726882716.73927: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.73943: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b0ed90> <<< 26764 1726882716.73966: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a154910> <<< 26764 1726882716.73980: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 26764 1726882716.74012: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 26764 1726882716.74039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 26764 1726882716.74104: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.74123: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b9f6a0> 
<<< 26764 1726882716.74415: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.74441: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b9db20> <<< 26764 1726882716.74456: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bad0a0> <<< 26764 1726882716.74507: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 26764 1726882716.74511: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b9f100> <<< 26764 1726882716.74529: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329be2b20> <<< 26764 1726882716.74542: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.74580: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.74600: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 26764 1726882716.74614: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.74722: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.74841: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.74883: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.74887: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 26764 1726882716.74926: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 26764 1726882716.74930: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 26764 1726882716.74952: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.75106: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.75255: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.76030: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.76803: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # <<< 26764 1726882716.76827: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882716.76891: stdout chunk 
(state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13297055e0> <<< 26764 1726882716.76958: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ae7580> <<< 26764 1726882716.76972: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13296a6100> <<< 26764 1726882716.77026: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 26764 1726882716.77066: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.77070: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 26764 1726882716.77192: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.77347: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329b9db80> <<< 26764 1726882716.77350: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.77732: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.78105: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.78149: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.78703: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 26764 1726882716.78781: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.78957: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 26764 1726882716.78995: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 26764 1726882716.79066: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13296d7f10> <<< 26764 1726882716.79081: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79127: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79225: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py <<< 26764 1726882716.79231: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 26764 1726882716.79249: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79260: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79308: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 26764 1726882716.79311: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79348: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79382: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79480: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79532: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 26764 1726882716.79567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 26764 1726882716.79633: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a168220> <<< 26764 1726882716.79708: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13296d7850> <<< 26764 1726882716.79711: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/file.py <<< 26764 1726882716.79724: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 26764 1726882716.79842: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79908: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79921: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.79977: stdout chunk (state=3): >>># 
/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 26764 1726882716.79981: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 26764 1726882716.79997: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 26764 1726882716.80018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 26764 1726882716.80048: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 26764 1726882716.80148: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329addca0> <<< 26764 1726882716.80184: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ad9f70> <<< 26764 1726882716.80252: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ad2940> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 26764 1726882716.80295: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.80311: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 26764 1726882716.80397: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 26764 1726882716.80412: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 26764 1726882716.80415: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.80522: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.80704: stdout chunk (state=3): >>># zipimport: zlib available <<< 26764 1726882716.80832: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 26764 1726882716.80845: stdout chunk (state=3): >>># destroy __main__ <<< 26764 1726882716.81185: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # 
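The JSON blob in the chunk above is the module's actual return value for this task: the stat module checked /run/ostree-booted (the marker file created at boot on rpm-ostree/OSTree systems) with the arguments recorded under module_args and reported that it does not exist. As a rough illustration only, not the module's real code, the exists=false outcome corresponds to a check like this:

    # Illustration of the reported {"stat": {"exists": false}} result;
    # the real stat module gathers many more fields when the path exists.
    import os

    path = "/run/ostree-booted"   # present only on OSTree-booted systems
    try:
        os.lstat(path)            # module_args had follow=false, so symlinks are not followed
        result = {"changed": False, "stat": {"exists": True}}
    except FileNotFoundError:
        result = {"changed": False, "stat": {"exists": False}}

    print(result)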
cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings <<< 26764 1726882716.81206: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing 
select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 <<< 26764 1726882716.81214: stdout chunk (state=3): >>># destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # 
destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 26764 1726882716.82294: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] 
wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks <<< 26764 1726882716.82479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882716.82520: stderr chunk (state=3): >>><<< 26764 1726882716.82523: stdout chunk (state=3): >>><<< 26764 1726882716.82609: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a773dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a7183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a773b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a773ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a718490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from 
'/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a718940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a718670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a730880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a718970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a66eeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a670f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a667610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a66d640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a66e370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a3d4e20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3d4910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3d4f10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3d4fd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e70d0> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a649d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a642670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6546d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a675e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a3e7cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a6492b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a6542e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a67b9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3ba3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3ba4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3eef40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e9a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e9490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2ee220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3a5520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e9f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a67b040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a300b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a300e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a311790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a311cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a29f400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a300f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2b02e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a311610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2b03a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2cb700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2cb9d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2cb7c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2cb8b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2cbd00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a2d6250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2cb940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2bfa90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a3e7610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a2cbaf0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f132a1ef6d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329bd6160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6f70> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6d90> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329bd6fd0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bd6100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object 
from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329b2df40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b4cd00> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b4ceb0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329b4c370> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a17bdc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a17b3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a17bfd0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a14cd30> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ba9d30> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ba9400> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bdf4f0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329ba9520> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ba9550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b1dfd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a15d250> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b1a850> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a15d3d0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a175e50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329b1a7f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b1a640> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b195b0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b0ed90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f132a154910> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b9f6a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b9db20> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329bad0a0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1329b9f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329be2b20> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13297055e0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ae7580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13296a6100> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329b9db80> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13296d7f10> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f132a168220> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13296d7850> import ansible.module_utils.common.file # 
loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329addca0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ad9f70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1329ad2940> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_5t8qsq6j/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy 
grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
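The output above is the raw result of the AnsiballZ stat module run against /run/ostree-booted (the JSON reports "exists": false), followed by the Python interpreter's shutdown trace and the SSH multiplexing debug output. For reference, a minimal sketch of the task that plausibly produced this trace, reconstructed only from what the log shows (the task name recorded further below, the logged module_args path, and the '__ostree_booted_stat' variable referenced later); it is not the role's actual source:

# Hypothetical reconstruction; the path and task name are confirmed by the trace,
# the register name is inferred from the '__ostree_booted_stat' variable seen below.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat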
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # 
[... interpreter cleanup output identical to the trace above, truncated ...]
cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 26764 1726882716.83226: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882716.83229: _low_level_execute_command(): starting 26764 1726882716.83232: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882716.09062-26850-59271690699949/ > /dev/null 2>&1 && sleep 0' 26764 1726882716.83587: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882716.83626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.83640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882716.84972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.84975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.84978: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882716.84980: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.84988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882716.84991: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882716.84993: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882716.84995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882716.84997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882716.84999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882716.85001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882716.85003: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882716.85005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882716.85007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882716.85009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882716.85221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882716.85418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882716.87294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882716.87298: stderr chunk (state=3): >>><<< 26764 1726882716.87301: stdout chunk (state=3): >>><<< 26764 1726882716.87320: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882716.87327: handler run complete 26764 1726882716.87348: attempt loop complete, returning result 26764 1726882716.87351: _execute() done 26764 1726882716.87353: dumping result to json 26764 1726882716.87356: done dumping result, returning 26764 1726882716.87370: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0e448fcc-3ce9-9875-c9a3-00000000004a] 26764 1726882716.87377: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000004a 26764 1726882716.87472: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000004a 26764 1726882716.87475: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": 
{ "exists": false } } 26764 1726882716.87529: no more pending results, returning what we have 26764 1726882716.87531: results queue empty 26764 1726882716.87532: checking for any_errors_fatal 26764 1726882716.87537: done checking for any_errors_fatal 26764 1726882716.87538: checking for max_fail_percentage 26764 1726882716.87539: done checking for max_fail_percentage 26764 1726882716.87540: checking to see if all hosts have failed and the running result is not ok 26764 1726882716.87541: done checking to see if all hosts have failed 26764 1726882716.87542: getting the remaining hosts for this loop 26764 1726882716.87543: done getting the remaining hosts for this loop 26764 1726882716.87546: getting the next task for host managed_node2 26764 1726882716.87552: done getting next task for host managed_node2 26764 1726882716.87554: ^ task is: TASK: Set flag to indicate system is ostree 26764 1726882716.87556: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882716.87559: getting variables 26764 1726882716.87561: in VariableManager get_vars() 26764 1726882716.87592: Calling all_inventory to load vars for managed_node2 26764 1726882716.87595: Calling groups_inventory to load vars for managed_node2 26764 1726882716.87598: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882716.87610: Calling all_plugins_play to load vars for managed_node2 26764 1726882716.87612: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882716.87615: Calling groups_plugins_play to load vars for managed_node2 26764 1726882716.87774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882716.87972: done with get_vars() 26764 1726882716.87984: done getting variables 26764 1726882716.88085: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:38:36 -0400 (0:00:00.869) 0:00:02.822 ****** 26764 1726882716.88117: entering _queue_task() for managed_node2/set_fact 26764 1726882716.88119: Creating lock for set_fact 26764 1726882716.88877: worker is 1 (out of 1 available) 26764 1726882716.88891: exiting _queue_task() for managed_node2/set_fact 26764 1726882716.88902: done queuing things up, now waiting for results queue to drain 26764 1726882716.88903: waiting for pending results... 
26764 1726882716.89480: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 26764 1726882716.89706: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000004b 26764 1726882716.89716: variable 'ansible_search_path' from source: unknown 26764 1726882716.89719: variable 'ansible_search_path' from source: unknown 26764 1726882716.89756: calling self._execute() 26764 1726882716.89938: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882716.89942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882716.89954: variable 'omit' from source: magic vars 26764 1726882716.91034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882716.91486: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882716.91644: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882716.91683: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882716.91715: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882716.91918: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882716.91947: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882716.91977: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882716.92003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882716.92238: Evaluated conditional (not __network_is_ostree is defined): True 26764 1726882716.92244: variable 'omit' from source: magic vars 26764 1726882716.92462: variable 'omit' from source: magic vars 26764 1726882716.92710: variable '__ostree_booted_stat' from source: set_fact 26764 1726882716.92757: variable 'omit' from source: magic vars 26764 1726882716.92834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882716.92861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882716.92881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882716.92897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882716.92908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882716.93105: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882716.93108: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882716.93111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882716.93321: Set connection var ansible_shell_executable to /bin/sh 26764 
1726882716.93324: Set connection var ansible_shell_type to sh 26764 1726882716.93334: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882716.93339: Set connection var ansible_timeout to 10 26764 1726882716.93345: Set connection var ansible_connection to ssh 26764 1726882716.93350: Set connection var ansible_pipelining to False 26764 1726882716.93428: variable 'ansible_shell_executable' from source: unknown 26764 1726882716.93432: variable 'ansible_connection' from source: unknown 26764 1726882716.93435: variable 'ansible_module_compression' from source: unknown 26764 1726882716.93437: variable 'ansible_shell_type' from source: unknown 26764 1726882716.93439: variable 'ansible_shell_executable' from source: unknown 26764 1726882716.93441: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882716.93443: variable 'ansible_pipelining' from source: unknown 26764 1726882716.93447: variable 'ansible_timeout' from source: unknown 26764 1726882716.93451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882716.93656: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882716.93668: variable 'omit' from source: magic vars 26764 1726882716.93675: starting attempt loop 26764 1726882716.93678: running the handler 26764 1726882716.93774: handler run complete 26764 1726882716.93786: attempt loop complete, returning result 26764 1726882716.93789: _execute() done 26764 1726882716.93792: dumping result to json 26764 1726882716.93835: done dumping result, returning 26764 1726882716.93844: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-9875-c9a3-00000000004b] 26764 1726882716.93847: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000004b 26764 1726882716.93938: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000004b 26764 1726882716.93941: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 26764 1726882716.94000: no more pending results, returning what we have 26764 1726882716.94003: results queue empty 26764 1726882716.94004: checking for any_errors_fatal 26764 1726882716.94009: done checking for any_errors_fatal 26764 1726882716.94010: checking for max_fail_percentage 26764 1726882716.94012: done checking for max_fail_percentage 26764 1726882716.94014: checking to see if all hosts have failed and the running result is not ok 26764 1726882716.94014: done checking to see if all hosts have failed 26764 1726882716.94015: getting the remaining hosts for this loop 26764 1726882716.94017: done getting the remaining hosts for this loop 26764 1726882716.94020: getting the next task for host managed_node2 26764 1726882716.94028: done getting next task for host managed_node2 26764 1726882716.94031: ^ task is: TASK: Fix CentOS6 Base repo 26764 1726882716.94033: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882716.94036: getting variables 26764 1726882716.94038: in VariableManager get_vars() 26764 1726882716.94067: Calling all_inventory to load vars for managed_node2 26764 1726882716.94070: Calling groups_inventory to load vars for managed_node2 26764 1726882716.94073: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882716.94084: Calling all_plugins_play to load vars for managed_node2 26764 1726882716.94086: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882716.94094: Calling groups_plugins_play to load vars for managed_node2 26764 1726882716.94301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882716.94517: done with get_vars() 26764 1726882716.94527: done getting variables 26764 1726882716.94760: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:38:36 -0400 (0:00:00.066) 0:00:02.889 ****** 26764 1726882716.94794: entering _queue_task() for managed_node2/copy 26764 1726882716.95597: worker is 1 (out of 1 available) 26764 1726882716.95610: exiting _queue_task() for managed_node2/copy 26764 1726882716.95623: done queuing things up, now waiting for results queue to drain 26764 1726882716.95624: waiting for pending results... 
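The 'Fix CentOS6 Base repo' task queued above uses the copy action and, as the following trace shows, is skipped because ansible_distribution_major_version == '6' evaluates to False on this host. A hedged sketch of the task shape (the copy action and both conditionals are confirmed by the trace; the destination path and file contents are assumptions, since the log never shows them):

# Hypothetical reconstruction; only the copy action and the two conditionals are
# confirmed by the trace below.
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo        # assumed destination, not shown in the trace
    content: |
      # repo definition pointing at the CentOS 6 vault (contents not shown in the trace)
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'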
26764 1726882716.95986: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 26764 1726882716.96176: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000004d 26764 1726882716.96186: variable 'ansible_search_path' from source: unknown 26764 1726882716.96190: variable 'ansible_search_path' from source: unknown 26764 1726882716.96383: calling self._execute() 26764 1726882716.96522: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882716.96526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882716.96537: variable 'omit' from source: magic vars 26764 1726882716.97546: variable 'ansible_distribution' from source: facts 26764 1726882716.97568: Evaluated conditional (ansible_distribution == 'CentOS'): True 26764 1726882716.97723: variable 'ansible_distribution_major_version' from source: facts 26764 1726882716.97730: Evaluated conditional (ansible_distribution_major_version == '6'): False 26764 1726882716.97732: when evaluation is False, skipping this task 26764 1726882716.97735: _execute() done 26764 1726882716.97739: dumping result to json 26764 1726882716.97859: done dumping result, returning 26764 1726882716.97870: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-9875-c9a3-00000000004d] 26764 1726882716.97877: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000004d 26764 1726882716.97970: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000004d 26764 1726882716.97973: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 26764 1726882716.98036: no more pending results, returning what we have 26764 1726882716.98039: results queue empty 26764 1726882716.98040: checking for any_errors_fatal 26764 1726882716.98044: done checking for any_errors_fatal 26764 1726882716.98045: checking for max_fail_percentage 26764 1726882716.98046: done checking for max_fail_percentage 26764 1726882716.98047: checking to see if all hosts have failed and the running result is not ok 26764 1726882716.98048: done checking to see if all hosts have failed 26764 1726882716.98049: getting the remaining hosts for this loop 26764 1726882716.98050: done getting the remaining hosts for this loop 26764 1726882716.98053: getting the next task for host managed_node2 26764 1726882716.98061: done getting next task for host managed_node2 26764 1726882716.98066: ^ task is: TASK: Include the task 'enable_epel.yml' 26764 1726882716.98069: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882716.98073: getting variables 26764 1726882716.98075: in VariableManager get_vars() 26764 1726882716.98105: Calling all_inventory to load vars for managed_node2 26764 1726882716.98108: Calling groups_inventory to load vars for managed_node2 26764 1726882716.98112: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882716.98125: Calling all_plugins_play to load vars for managed_node2 26764 1726882716.98129: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882716.98132: Calling groups_plugins_play to load vars for managed_node2 26764 1726882716.98294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882716.98518: done with get_vars() 26764 1726882716.98528: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:38:36 -0400 (0:00:00.038) 0:00:02.927 ****** 26764 1726882716.98622: entering _queue_task() for managed_node2/include_tasks 26764 1726882716.99980: worker is 1 (out of 1 available) 26764 1726882716.99994: exiting _queue_task() for managed_node2/include_tasks 26764 1726882717.00202: done queuing things up, now waiting for results queue to drain 26764 1726882717.00204: waiting for pending results... 26764 1726882717.00677: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 26764 1726882717.00858: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000004e 26764 1726882717.00961: variable 'ansible_search_path' from source: unknown 26764 1726882717.00966: variable 'ansible_search_path' from source: unknown 26764 1726882717.01002: calling self._execute() 26764 1726882717.01166: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.01174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.01184: variable 'omit' from source: magic vars 26764 1726882717.02261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882717.08203: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882717.08379: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882717.08412: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882717.08443: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882717.08587: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882717.08659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882717.08813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882717.08838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 26764 1726882717.09021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882717.09037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882717.09266: variable '__network_is_ostree' from source: set_fact 26764 1726882717.09285: Evaluated conditional (not __network_is_ostree | d(false)): True 26764 1726882717.09292: _execute() done 26764 1726882717.09295: dumping result to json 26764 1726882717.09297: done dumping result, returning 26764 1726882717.09304: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-9875-c9a3-00000000004e] 26764 1726882717.09309: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000004e 26764 1726882717.09519: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000004e 26764 1726882717.09522: WORKER PROCESS EXITING 26764 1726882717.09551: no more pending results, returning what we have 26764 1726882717.09556: in VariableManager get_vars() 26764 1726882717.09591: Calling all_inventory to load vars for managed_node2 26764 1726882717.09593: Calling groups_inventory to load vars for managed_node2 26764 1726882717.09596: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.09608: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.09611: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.09613: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.09771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.09960: done with get_vars() 26764 1726882717.09971: variable 'ansible_search_path' from source: unknown 26764 1726882717.09972: variable 'ansible_search_path' from source: unknown 26764 1726882717.10013: we have included files to process 26764 1726882717.10014: generating all_blocks data 26764 1726882717.10015: done generating all_blocks data 26764 1726882717.10020: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 26764 1726882717.10021: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 26764 1726882717.10023: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 26764 1726882717.13149: done processing included file 26764 1726882717.13152: iterating over new_blocks loaded from include file 26764 1726882717.13154: in VariableManager get_vars() 26764 1726882717.13171: done with get_vars() 26764 1726882717.13174: filtering new block on tags 26764 1726882717.13201: done filtering new block on tags 26764 1726882717.13204: in VariableManager get_vars() 26764 1726882717.13217: done with get_vars() 26764 1726882717.13219: filtering new block on tags 26764 1726882717.13456: done filtering new block on tags 26764 1726882717.13459: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 26764 1726882717.13467: extending task lists for all hosts with included blocks 
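At this point the include has been resolved: the conditional 'not __network_is_ostree | d(false)' evaluated True, enable_epel.yml was loaded, and its blocks are being appended to the task list for managed_node2. A minimal sketch of the include task itself, using only details present in the trace (the include_tasks action, the file name, and the conditional); the fully qualified module name is an assumption:

# Hypothetical reconstruction; action, file name, and conditional are confirmed by the trace above.
- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)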
26764 1726882717.13797: done extending task lists 26764 1726882717.13798: done processing included files 26764 1726882717.13799: results queue empty 26764 1726882717.13800: checking for any_errors_fatal 26764 1726882717.13804: done checking for any_errors_fatal 26764 1726882717.13804: checking for max_fail_percentage 26764 1726882717.13806: done checking for max_fail_percentage 26764 1726882717.13807: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.13808: done checking to see if all hosts have failed 26764 1726882717.13809: getting the remaining hosts for this loop 26764 1726882717.13810: done getting the remaining hosts for this loop 26764 1726882717.13812: getting the next task for host managed_node2 26764 1726882717.13817: done getting next task for host managed_node2 26764 1726882717.13819: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 26764 1726882717.13823: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882717.13825: getting variables 26764 1726882717.13826: in VariableManager get_vars() 26764 1726882717.13835: Calling all_inventory to load vars for managed_node2 26764 1726882717.13837: Calling groups_inventory to load vars for managed_node2 26764 1726882717.13839: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.13845: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.13853: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.13856: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.14449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.15485: done with get_vars() 26764 1726882717.15493: done getting variables 26764 1726882717.15782: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 26764 1726882717.16427: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:38:37 -0400 (0:00:00.178) 0:00:03.106 ****** 26764 1726882717.16474: entering _queue_task() for managed_node2/command 26764 1726882717.16476: Creating lock for command 26764 1726882717.17436: worker is 1 (out of 1 available) 26764 1726882717.17448: exiting _queue_task() for managed_node2/command 26764 1726882717.17679: done queuing things up, now waiting for results queue to drain 26764 1726882717.17680: waiting for pending results... 
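The task just queued, "Create EPEL {{ ansible_distribution_major_version }}" (rendered as "Create EPEL 9" because the gathered facts report major version 9), resolves to the command action and, as the next entries show, is skipped because ansible_distribution_major_version in ['7', '8'] is false on this host. A hedged sketch of such a task; only the templated name, the command action, and the two evaluated conditionals are taken from the log, while the command line and URL are illustrative assumptions rather than the actual file contents:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      command: >-
        rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']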
26764 1726882717.18907: running TaskExecutor() for managed_node2/TASK: Create EPEL 9 26764 1726882717.19273: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000068 26764 1726882717.19313: variable 'ansible_search_path' from source: unknown 26764 1726882717.19321: variable 'ansible_search_path' from source: unknown 26764 1726882717.19359: calling self._execute() 26764 1726882717.19441: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.19455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.19478: variable 'omit' from source: magic vars 26764 1726882717.20069: variable 'ansible_distribution' from source: facts 26764 1726882717.20203: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26764 1726882717.20385: variable 'ansible_distribution_major_version' from source: facts 26764 1726882717.20522: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 26764 1726882717.20530: when evaluation is False, skipping this task 26764 1726882717.20538: _execute() done 26764 1726882717.20545: dumping result to json 26764 1726882717.20551: done dumping result, returning 26764 1726882717.20560: done running TaskExecutor() for managed_node2/TASK: Create EPEL 9 [0e448fcc-3ce9-9875-c9a3-000000000068] 26764 1726882717.20575: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000068 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 26764 1726882717.20743: no more pending results, returning what we have 26764 1726882717.20746: results queue empty 26764 1726882717.20747: checking for any_errors_fatal 26764 1726882717.20748: done checking for any_errors_fatal 26764 1726882717.20749: checking for max_fail_percentage 26764 1726882717.20751: done checking for max_fail_percentage 26764 1726882717.20751: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.20752: done checking to see if all hosts have failed 26764 1726882717.20753: getting the remaining hosts for this loop 26764 1726882717.20754: done getting the remaining hosts for this loop 26764 1726882717.20757: getting the next task for host managed_node2 26764 1726882717.20767: done getting next task for host managed_node2 26764 1726882717.20769: ^ task is: TASK: Install yum-utils package 26764 1726882717.20773: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882717.20776: getting variables 26764 1726882717.20778: in VariableManager get_vars() 26764 1726882717.20806: Calling all_inventory to load vars for managed_node2 26764 1726882717.20809: Calling groups_inventory to load vars for managed_node2 26764 1726882717.20813: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.20828: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.20832: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.20837: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.21003: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000068 26764 1726882717.21008: WORKER PROCESS EXITING 26764 1726882717.21028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.21235: done with get_vars() 26764 1726882717.21243: done getting variables 26764 1726882717.21456: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:38:37 -0400 (0:00:00.050) 0:00:03.156 ****** 26764 1726882717.21488: entering _queue_task() for managed_node2/package 26764 1726882717.21490: Creating lock for package 26764 1726882717.22189: worker is 1 (out of 1 available) 26764 1726882717.22424: exiting _queue_task() for managed_node2/package 26764 1726882717.22437: done queuing things up, now waiting for results queue to drain 26764 1726882717.22438: waiting for pending results... 
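"Install yum-utils package" resolves to the generic package action and carries the same 7/8 guard, so it is skipped on this EL9 host as well. A plausible sketch (action and conditionals from the log; the package name is inferred from the task name and the state is assumed):

    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']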
26764 1726882717.22924: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 26764 1726882717.23140: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000069 26764 1726882717.23157: variable 'ansible_search_path' from source: unknown 26764 1726882717.23168: variable 'ansible_search_path' from source: unknown 26764 1726882717.23320: calling self._execute() 26764 1726882717.23396: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.23412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.23482: variable 'omit' from source: magic vars 26764 1726882717.24296: variable 'ansible_distribution' from source: facts 26764 1726882717.24388: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26764 1726882717.24716: variable 'ansible_distribution_major_version' from source: facts 26764 1726882717.24727: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 26764 1726882717.24734: when evaluation is False, skipping this task 26764 1726882717.24740: _execute() done 26764 1726882717.24747: dumping result to json 26764 1726882717.24754: done dumping result, returning 26764 1726882717.24762: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0e448fcc-3ce9-9875-c9a3-000000000069] 26764 1726882717.24777: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000069 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 26764 1726882717.24921: no more pending results, returning what we have 26764 1726882717.24924: results queue empty 26764 1726882717.24925: checking for any_errors_fatal 26764 1726882717.24929: done checking for any_errors_fatal 26764 1726882717.24930: checking for max_fail_percentage 26764 1726882717.24932: done checking for max_fail_percentage 26764 1726882717.24933: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.24933: done checking to see if all hosts have failed 26764 1726882717.24934: getting the remaining hosts for this loop 26764 1726882717.24935: done getting the remaining hosts for this loop 26764 1726882717.24938: getting the next task for host managed_node2 26764 1726882717.24945: done getting next task for host managed_node2 26764 1726882717.24947: ^ task is: TASK: Enable EPEL 7 26764 1726882717.24951: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882717.24954: getting variables 26764 1726882717.24955: in VariableManager get_vars() 26764 1726882717.25032: Calling all_inventory to load vars for managed_node2 26764 1726882717.25036: Calling groups_inventory to load vars for managed_node2 26764 1726882717.25040: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.25054: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.25057: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.25061: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.25219: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000069 26764 1726882717.25223: WORKER PROCESS EXITING 26764 1726882717.25238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.25430: done with get_vars() 26764 1726882717.25443: done getting variables 26764 1726882717.25502: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:38:37 -0400 (0:00:00.040) 0:00:03.196 ****** 26764 1726882717.25529: entering _queue_task() for managed_node2/command 26764 1726882717.26424: worker is 1 (out of 1 available) 26764 1726882717.26436: exiting _queue_task() for managed_node2/command 26764 1726882717.26448: done queuing things up, now waiting for results queue to drain 26764 1726882717.26449: waiting for pending results... 
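"Enable EPEL 7" at enable_epel.yml:32, like "Enable EPEL 8" at line 37 queued further below, resolves to the command action and is evaluated against the same in-['7', '8'] condition, so both are skipped here. A hedged sketch of the pattern; the yum-config-manager invocation is an illustrative assumption, and the real task may carry further version-specific conditions that never get evaluated in this run because the first one already fails:

    - name: Enable EPEL 7
      command: yum-config-manager --enable epel
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']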
26764 1726882717.28054: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 26764 1726882717.28151: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000006a 26764 1726882717.28174: variable 'ansible_search_path' from source: unknown 26764 1726882717.28181: variable 'ansible_search_path' from source: unknown 26764 1726882717.28221: calling self._execute() 26764 1726882717.28301: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.28312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.28326: variable 'omit' from source: magic vars 26764 1726882717.28892: variable 'ansible_distribution' from source: facts 26764 1726882717.28959: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26764 1726882717.29195: variable 'ansible_distribution_major_version' from source: facts 26764 1726882717.29241: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 26764 1726882717.29288: when evaluation is False, skipping this task 26764 1726882717.29296: _execute() done 26764 1726882717.29302: dumping result to json 26764 1726882717.29309: done dumping result, returning 26764 1726882717.29317: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0e448fcc-3ce9-9875-c9a3-00000000006a] 26764 1726882717.29326: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000006a skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 26764 1726882717.29514: no more pending results, returning what we have 26764 1726882717.29518: results queue empty 26764 1726882717.29519: checking for any_errors_fatal 26764 1726882717.29524: done checking for any_errors_fatal 26764 1726882717.29525: checking for max_fail_percentage 26764 1726882717.29526: done checking for max_fail_percentage 26764 1726882717.29527: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.29528: done checking to see if all hosts have failed 26764 1726882717.29529: getting the remaining hosts for this loop 26764 1726882717.29531: done getting the remaining hosts for this loop 26764 1726882717.29535: getting the next task for host managed_node2 26764 1726882717.29543: done getting next task for host managed_node2 26764 1726882717.29546: ^ task is: TASK: Enable EPEL 8 26764 1726882717.29550: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882717.29555: getting variables 26764 1726882717.29557: in VariableManager get_vars() 26764 1726882717.29586: Calling all_inventory to load vars for managed_node2 26764 1726882717.29589: Calling groups_inventory to load vars for managed_node2 26764 1726882717.29592: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.29607: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.29611: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.29616: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.29787: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000006a 26764 1726882717.29791: WORKER PROCESS EXITING 26764 1726882717.29805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.30006: done with get_vars() 26764 1726882717.30015: done getting variables 26764 1726882717.30073: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:38:37 -0400 (0:00:00.045) 0:00:03.242 ****** 26764 1726882717.30102: entering _queue_task() for managed_node2/command 26764 1726882717.30893: worker is 1 (out of 1 available) 26764 1726882717.30905: exiting _queue_task() for managed_node2/command 26764 1726882717.30917: done queuing things up, now waiting for results queue to drain 26764 1726882717.30918: waiting for pending results... 
26764 1726882717.31306: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 26764 1726882717.31518: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000006b 26764 1726882717.31651: variable 'ansible_search_path' from source: unknown 26764 1726882717.31658: variable 'ansible_search_path' from source: unknown 26764 1726882717.31699: calling self._execute() 26764 1726882717.31987: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.32000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.32082: variable 'omit' from source: magic vars 26764 1726882717.32714: variable 'ansible_distribution' from source: facts 26764 1726882717.32784: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26764 1726882717.33033: variable 'ansible_distribution_major_version' from source: facts 26764 1726882717.33168: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 26764 1726882717.33179: when evaluation is False, skipping this task 26764 1726882717.33186: _execute() done 26764 1726882717.33193: dumping result to json 26764 1726882717.33201: done dumping result, returning 26764 1726882717.33209: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0e448fcc-3ce9-9875-c9a3-00000000006b] 26764 1726882717.33218: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000006b skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 26764 1726882717.33357: no more pending results, returning what we have 26764 1726882717.33361: results queue empty 26764 1726882717.33361: checking for any_errors_fatal 26764 1726882717.33371: done checking for any_errors_fatal 26764 1726882717.33372: checking for max_fail_percentage 26764 1726882717.33374: done checking for max_fail_percentage 26764 1726882717.33375: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.33375: done checking to see if all hosts have failed 26764 1726882717.33376: getting the remaining hosts for this loop 26764 1726882717.33378: done getting the remaining hosts for this loop 26764 1726882717.33381: getting the next task for host managed_node2 26764 1726882717.33391: done getting next task for host managed_node2 26764 1726882717.33393: ^ task is: TASK: Enable EPEL 6 26764 1726882717.33397: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882717.33400: getting variables 26764 1726882717.33402: in VariableManager get_vars() 26764 1726882717.33470: Calling all_inventory to load vars for managed_node2 26764 1726882717.33473: Calling groups_inventory to load vars for managed_node2 26764 1726882717.33477: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.33492: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.33495: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.33498: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.33641: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000006b 26764 1726882717.33645: WORKER PROCESS EXITING 26764 1726882717.33659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.33849: done with get_vars() 26764 1726882717.33862: done getting variables 26764 1726882717.33920: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:38:37 -0400 (0:00:00.038) 0:00:03.281 ****** 26764 1726882717.33948: entering _queue_task() for managed_node2/copy 26764 1726882717.34737: worker is 1 (out of 1 available) 26764 1726882717.34750: exiting _queue_task() for managed_node2/copy 26764 1726882717.34761: done queuing things up, now waiting for results queue to drain 26764 1726882717.34762: waiting for pending results... 
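"Enable EPEL 6" differs from the 7/8 variants: it resolves to the copy action and, as the next entries show, is guarded by ansible_distribution_major_version == '6'. A hedged sketch of what a copy-based variant could look like; the destination path and repository contents are purely illustrative assumptions, only the task name, the copy action, and the conditionals come from the log:

    - name: Enable EPEL 6
      copy:
        dest: /etc/yum.repos.d/epel.repo
        content: |
          [epel]
          name=Extra Packages for Enterprise Linux 6
          baseurl=https://archives.fedoraproject.org/pub/archive/epel/6/$basearch/
          enabled=1
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version == '6'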
26764 1726882717.36286: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 26764 1726882717.36389: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000006d 26764 1726882717.36584: variable 'ansible_search_path' from source: unknown 26764 1726882717.36591: variable 'ansible_search_path' from source: unknown 26764 1726882717.36628: calling self._execute() 26764 1726882717.36704: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.36779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.36793: variable 'omit' from source: magic vars 26764 1726882717.37854: variable 'ansible_distribution' from source: facts 26764 1726882717.37878: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 26764 1726882717.37990: variable 'ansible_distribution_major_version' from source: facts 26764 1726882717.38000: Evaluated conditional (ansible_distribution_major_version == '6'): False 26764 1726882717.38008: when evaluation is False, skipping this task 26764 1726882717.38015: _execute() done 26764 1726882717.38021: dumping result to json 26764 1726882717.38028: done dumping result, returning 26764 1726882717.38037: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0e448fcc-3ce9-9875-c9a3-00000000006d] 26764 1726882717.38046: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000006d skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 26764 1726882717.38196: no more pending results, returning what we have 26764 1726882717.38200: results queue empty 26764 1726882717.38201: checking for any_errors_fatal 26764 1726882717.38205: done checking for any_errors_fatal 26764 1726882717.38206: checking for max_fail_percentage 26764 1726882717.38208: done checking for max_fail_percentage 26764 1726882717.38209: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.38210: done checking to see if all hosts have failed 26764 1726882717.38210: getting the remaining hosts for this loop 26764 1726882717.38212: done getting the remaining hosts for this loop 26764 1726882717.38215: getting the next task for host managed_node2 26764 1726882717.38225: done getting next task for host managed_node2 26764 1726882717.38229: ^ task is: TASK: Set network provider to 'nm' 26764 1726882717.38231: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882717.38236: getting variables 26764 1726882717.38237: in VariableManager get_vars() 26764 1726882717.38270: Calling all_inventory to load vars for managed_node2 26764 1726882717.38274: Calling groups_inventory to load vars for managed_node2 26764 1726882717.38278: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.38292: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.38295: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.38298: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.38482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.38683: done with get_vars() 26764 1726882717.38694: done getting variables 26764 1726882717.38769: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_reapply_nm.yml:13 Friday 20 September 2024 21:38:37 -0400 (0:00:00.048) 0:00:03.329 ****** 26764 1726882717.38801: entering _queue_task() for managed_node2/set_fact 26764 1726882717.38819: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000006d 26764 1726882717.38828: WORKER PROCESS EXITING 26764 1726882717.39284: worker is 1 (out of 1 available) 26764 1726882717.39294: exiting _queue_task() for managed_node2/set_fact 26764 1726882717.39306: done queuing things up, now waiting for results queue to drain 26764 1726882717.39308: waiting for pending results... 
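The set_fact task queued here is essentially determined by its result a little further down (ok, with ansible_facts network_provider: "nm"), so a minimal sketch of it is straightforward:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

This fact is presumably what later steers the fedora.linux_system_roles network role toward the NetworkManager provider for the reapply test.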
26764 1726882717.40210: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 26764 1726882717.40337: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000007 26764 1726882717.40429: variable 'ansible_search_path' from source: unknown 26764 1726882717.40472: calling self._execute() 26764 1726882717.40720: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.40857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.40876: variable 'omit' from source: magic vars 26764 1726882717.40978: variable 'omit' from source: magic vars 26764 1726882717.41100: variable 'omit' from source: magic vars 26764 1726882717.41138: variable 'omit' from source: magic vars 26764 1726882717.41211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882717.41317: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882717.41340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882717.41408: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882717.41423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882717.41526: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882717.41535: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.41542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.41751: Set connection var ansible_shell_executable to /bin/sh 26764 1726882717.41758: Set connection var ansible_shell_type to sh 26764 1726882717.41777: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882717.41786: Set connection var ansible_timeout to 10 26764 1726882717.41793: Set connection var ansible_connection to ssh 26764 1726882717.41801: Set connection var ansible_pipelining to False 26764 1726882717.41939: variable 'ansible_shell_executable' from source: unknown 26764 1726882717.41946: variable 'ansible_connection' from source: unknown 26764 1726882717.41953: variable 'ansible_module_compression' from source: unknown 26764 1726882717.41959: variable 'ansible_shell_type' from source: unknown 26764 1726882717.41969: variable 'ansible_shell_executable' from source: unknown 26764 1726882717.41976: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.41983: variable 'ansible_pipelining' from source: unknown 26764 1726882717.41989: variable 'ansible_timeout' from source: unknown 26764 1726882717.41996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.42259: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882717.42283: variable 'omit' from source: magic vars 26764 1726882717.42295: starting attempt loop 26764 1726882717.42374: running the handler 26764 1726882717.42390: handler run complete 26764 1726882717.42404: attempt loop complete, returning result 26764 1726882717.42410: _execute() done 26764 1726882717.42416: 
dumping result to json 26764 1726882717.42424: done dumping result, returning 26764 1726882717.42435: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0e448fcc-3ce9-9875-c9a3-000000000007] 26764 1726882717.42444: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000007 26764 1726882717.42661: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000007 26764 1726882717.42677: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 26764 1726882717.42791: no more pending results, returning what we have 26764 1726882717.42795: results queue empty 26764 1726882717.42796: checking for any_errors_fatal 26764 1726882717.42800: done checking for any_errors_fatal 26764 1726882717.42801: checking for max_fail_percentage 26764 1726882717.42803: done checking for max_fail_percentage 26764 1726882717.42804: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.42805: done checking to see if all hosts have failed 26764 1726882717.42806: getting the remaining hosts for this loop 26764 1726882717.42807: done getting the remaining hosts for this loop 26764 1726882717.42811: getting the next task for host managed_node2 26764 1726882717.42819: done getting next task for host managed_node2 26764 1726882717.42821: ^ task is: TASK: meta (flush_handlers) 26764 1726882717.42823: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882717.42827: getting variables 26764 1726882717.42829: in VariableManager get_vars() 26764 1726882717.42855: Calling all_inventory to load vars for managed_node2 26764 1726882717.42858: Calling groups_inventory to load vars for managed_node2 26764 1726882717.42861: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.42878: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.42882: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.42886: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.43031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.43215: done with get_vars() 26764 1726882717.43225: done getting variables 26764 1726882717.43295: in VariableManager get_vars() 26764 1726882717.43302: Calling all_inventory to load vars for managed_node2 26764 1726882717.43304: Calling groups_inventory to load vars for managed_node2 26764 1726882717.43306: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.43310: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.43312: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.43315: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.44347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.44528: done with get_vars() 26764 1726882717.44540: done queuing things up, now waiting for results queue to drain 26764 1726882717.44542: results queue empty 26764 1726882717.44543: checking for any_errors_fatal 26764 1726882717.44545: done checking for any_errors_fatal 26764 1726882717.44546: checking for 
max_fail_percentage 26764 1726882717.44547: done checking for max_fail_percentage 26764 1726882717.44547: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.44548: done checking to see if all hosts have failed 26764 1726882717.44549: getting the remaining hosts for this loop 26764 1726882717.44550: done getting the remaining hosts for this loop 26764 1726882717.44552: getting the next task for host managed_node2 26764 1726882717.44556: done getting next task for host managed_node2 26764 1726882717.44557: ^ task is: TASK: meta (flush_handlers) 26764 1726882717.44558: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882717.44571: getting variables 26764 1726882717.44572: in VariableManager get_vars() 26764 1726882717.44580: Calling all_inventory to load vars for managed_node2 26764 1726882717.44582: Calling groups_inventory to load vars for managed_node2 26764 1726882717.44584: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.44588: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.44591: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.44593: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.44749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.44941: done with get_vars() 26764 1726882717.44947: done getting variables 26764 1726882717.44995: in VariableManager get_vars() 26764 1726882717.45002: Calling all_inventory to load vars for managed_node2 26764 1726882717.45004: Calling groups_inventory to load vars for managed_node2 26764 1726882717.45006: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.45011: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.45013: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.45016: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.45145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.46032: done with get_vars() 26764 1726882717.46043: done queuing things up, now waiting for results queue to drain 26764 1726882717.46045: results queue empty 26764 1726882717.46046: checking for any_errors_fatal 26764 1726882717.46047: done checking for any_errors_fatal 26764 1726882717.46048: checking for max_fail_percentage 26764 1726882717.46049: done checking for max_fail_percentage 26764 1726882717.46049: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.46050: done checking to see if all hosts have failed 26764 1726882717.46051: getting the remaining hosts for this loop 26764 1726882717.46052: done getting the remaining hosts for this loop 26764 1726882717.46054: getting the next task for host managed_node2 26764 1726882717.46056: done getting next task for host managed_node2 26764 1726882717.46057: ^ task is: None 26764 1726882717.46058: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 26764 1726882717.46059: done queuing things up, now waiting for results queue to drain 26764 1726882717.46060: results queue empty 26764 1726882717.46061: checking for any_errors_fatal 26764 1726882717.46062: done checking for any_errors_fatal 26764 1726882717.46062: checking for max_fail_percentage 26764 1726882717.46067: done checking for max_fail_percentage 26764 1726882717.46067: checking to see if all hosts have failed and the running result is not ok 26764 1726882717.46068: done checking to see if all hosts have failed 26764 1726882717.46070: getting the next task for host managed_node2 26764 1726882717.46072: done getting next task for host managed_node2 26764 1726882717.46073: ^ task is: None 26764 1726882717.46074: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882717.46120: in VariableManager get_vars() 26764 1726882717.46139: done with get_vars() 26764 1726882717.46144: in VariableManager get_vars() 26764 1726882717.46156: done with get_vars() 26764 1726882717.46161: variable 'omit' from source: magic vars 26764 1726882717.46195: in VariableManager get_vars() 26764 1726882717.46207: done with get_vars() 26764 1726882717.46228: variable 'omit' from source: magic vars PLAY [Play for testing reapplying the connection] ****************************** 26764 1726882717.47103: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 26764 1726882717.47598: getting the remaining hosts for this loop 26764 1726882717.47600: done getting the remaining hosts for this loop 26764 1726882717.47602: getting the next task for host managed_node2 26764 1726882717.47604: done getting next task for host managed_node2 26764 1726882717.47606: ^ task is: TASK: Gathering Facts 26764 1726882717.47608: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882717.47609: getting variables 26764 1726882717.47610: in VariableManager get_vars() 26764 1726882717.47620: Calling all_inventory to load vars for managed_node2 26764 1726882717.47622: Calling groups_inventory to load vars for managed_node2 26764 1726882717.47624: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882717.47629: Calling all_plugins_play to load vars for managed_node2 26764 1726882717.47642: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882717.47645: Calling groups_plugins_play to load vars for managed_node2 26764 1726882717.47781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882717.47955: done with get_vars() 26764 1726882717.48467: done getting variables 26764 1726882717.48505: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:7 Friday 20 September 2024 21:38:37 -0400 (0:00:00.097) 0:00:03.426 ****** 26764 1726882717.48526: entering _queue_task() for managed_node2/gather_facts 26764 1726882717.48794: worker is 1 (out of 1 available) 26764 1726882717.48805: exiting _queue_task() for managed_node2/gather_facts 26764 1726882717.48818: done queuing things up, now waiting for results queue to drain 26764 1726882717.48820: waiting for pending results... 
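From here on, the second play ("Play for testing reapplying the connection") starts and implicit fact gathering runs against managed_node2: everything below, the 'echo ~' probe, the remote temp directory creation, the sftp transfer of AnsiballZ_setup.py, and its execution with /usr/bin/python3.9, is the normal gather_facts/setup machinery. A hedged sketch of how such a play could open; the play name and the fact-gathering step come from the log, while the hosts pattern and the rest of the layout are assumptions:

    - name: Play for testing reapplying the connection
      hosts: all           # assumption; the log only shows managed_node2 being targeted
      gather_facts: true   # produces the TASK [Gathering Facts] entry above
      tasks: []            # placeholder; the actual reapply test tasks are not visible in this excerpt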
26764 1726882717.49549: running TaskExecutor() for managed_node2/TASK: Gathering Facts 26764 1726882717.49822: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000093 26764 1726882717.49977: variable 'ansible_search_path' from source: unknown 26764 1726882717.50018: calling self._execute() 26764 1726882717.50096: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.50178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.50195: variable 'omit' from source: magic vars 26764 1726882717.50943: variable 'ansible_distribution_major_version' from source: facts 26764 1726882717.51086: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882717.51098: variable 'omit' from source: magic vars 26764 1726882717.51127: variable 'omit' from source: magic vars 26764 1726882717.51167: variable 'omit' from source: magic vars 26764 1726882717.51221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882717.51324: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882717.51418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882717.51441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882717.51458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882717.51537: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882717.51548: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.51555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.51712: Set connection var ansible_shell_executable to /bin/sh 26764 1726882717.51839: Set connection var ansible_shell_type to sh 26764 1726882717.51855: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882717.51867: Set connection var ansible_timeout to 10 26764 1726882717.51879: Set connection var ansible_connection to ssh 26764 1726882717.51889: Set connection var ansible_pipelining to False 26764 1726882717.51914: variable 'ansible_shell_executable' from source: unknown 26764 1726882717.51922: variable 'ansible_connection' from source: unknown 26764 1726882717.51929: variable 'ansible_module_compression' from source: unknown 26764 1726882717.51946: variable 'ansible_shell_type' from source: unknown 26764 1726882717.52057: variable 'ansible_shell_executable' from source: unknown 26764 1726882717.52067: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882717.52077: variable 'ansible_pipelining' from source: unknown 26764 1726882717.52085: variable 'ansible_timeout' from source: unknown 26764 1726882717.52093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882717.52454: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882717.52471: variable 'omit' from source: magic vars 26764 1726882717.52480: starting attempt loop 26764 1726882717.52490: running the 
handler 26764 1726882717.52507: variable 'ansible_facts' from source: unknown 26764 1726882717.52603: _low_level_execute_command(): starting 26764 1726882717.52615: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882717.55398: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882717.55482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882717.55573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882717.56186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882717.56350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882717.56362: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882717.56380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.56398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882717.56436: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882717.56449: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882717.56462: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882717.56478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882717.56493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882717.56506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882717.56542: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882717.56556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.56683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882717.56768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882717.56782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882717.57082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26764 1726882717.59101: stdout chunk (state=3): >>>/root <<< 26764 1726882717.59195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882717.59268: stderr chunk (state=3): >>><<< 26764 1726882717.59272: stdout chunk (state=3): >>><<< 26764 1726882717.59375: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 26764 1726882717.59378: _low_level_execute_command(): starting 26764 1726882717.59382: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817 `" && echo ansible-tmp-1726882717.5929146-26918-106595545220817="` echo /root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817 `" ) && sleep 0' 26764 1726882717.60156: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882717.60176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882717.60193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882717.60326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882717.60371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882717.60384: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882717.60400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.60419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882717.60432: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882717.60443: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882717.60454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882717.60474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882717.60490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882717.60503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882717.60514: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882717.60530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.60607: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882717.60662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882717.60687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882717.60836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882717.63532: stdout chunk (state=3): >>>ansible-tmp-1726882717.5929146-26918-106595545220817=/root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817 <<< 26764 1726882717.63775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882717.63778: stdout chunk (state=3): >>><<< 26764 1726882717.63781: stderr chunk (state=3): >>><<< 26764 1726882717.64203: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882717.5929146-26918-106595545220817=/root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882717.64206: variable 'ansible_module_compression' from source: unknown 26764 1726882717.64208: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 26764 1726882717.64210: variable 'ansible_facts' from source: unknown 26764 1726882717.64212: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817/AnsiballZ_setup.py 26764 1726882717.64270: Sending initial data 26764 1726882717.64274: Sent initial data (154 bytes) 26764 1726882717.65149: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882717.65161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882717.65179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882717.65194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882717.65228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882717.65240: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882717.65257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.65278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882717.65290: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882717.65301: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882717.65313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882717.65326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882717.65340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882717.65352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882717.65362: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882717.65382: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.65449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882717.65487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882717.65508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882717.65646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882717.68162: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882717.68257: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882717.68361: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpniiuvcu2 /root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817/AnsiballZ_setup.py <<< 26764 1726882717.68456: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882717.70506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882717.70649: stderr chunk (state=3): >>><<< 26764 1726882717.70652: stdout chunk (state=3): >>><<< 26764 1726882717.70683: done transferring module to remote 26764 1726882717.70694: _low_level_execute_command(): starting 26764 1726882717.70699: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817/ /root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817/AnsiballZ_setup.py && sleep 0' 26764 1726882717.71987: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882717.71991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882717.72047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.72050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882717.72053: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882717.72055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.72107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 
1726882717.72111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882717.72223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882717.74677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882717.74718: stderr chunk (state=3): >>><<< 26764 1726882717.74723: stdout chunk (state=3): >>><<< 26764 1726882717.74746: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882717.74749: _low_level_execute_command(): starting 26764 1726882717.74752: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817/AnsiballZ_setup.py && sleep 0' 26764 1726882717.75322: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882717.75336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882717.75350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882717.75371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882717.75410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882717.75421: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882717.75433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.75449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882717.75461: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882717.75478: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882717.75490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882717.75502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882717.75516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882717.75526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882717.75536: stderr chunk 
(state=3): >>>debug2: match found <<< 26764 1726882717.75548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882717.75629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882717.75649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882717.75674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882717.75818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882718.32718: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcN<<< 26764 1726882718.32758: stdout chunk (state=3): >>>i0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "38", "epoch": "1726882718", "epoch_int": "1726882718", "date": 
"2024-09-20", "time": "21:38:38", "iso8601_micro": "2024-09-21T01:38:38.043531Z", "iso8601": "2024-09-21T01:38:38Z", "iso8601_basic": "20240920T213838043531", "iso8601_basic_short": "20240920T213838", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.4, "5m": 0.4, "15m": 0.24}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2792, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 740, "free": 2792}, "nocache": {"free": 3255, "used": 277}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 657, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238559232, "block_size": 4096, "block_total": 65519355, "block_available": 64511367, "block_used": 1007988, "inode_total": 131071472, "inode_available": 130998693, "inode_used": 72779, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}<<< 26764 1726882718.32785: stdout chunk (state=3): >>>], "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fix<<< 26764 1726882718.33022: stdout chunk (state=3): >>>ed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": 
"off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 26764 1726882718.35223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882718.35226: stdout chunk (state=3): >>><<< 26764 1726882718.35229: stderr chunk (state=3): >>><<< 26764 1726882718.35478: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-158", "ansible_nodename": "ip-10-31-11-158.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "21e18164a0c64d0daed004bd8a1b67b7", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALEARW5ZJ51XTLSDuUsPojumVU0f1DmiQsXjMOap4QLlljOiysapjSUe6pZOyAdiI/KfARhDoOFvlC07kCLCcs7DDk8JxBZpsM0D55SdDlfwsB3FVgWNP+9by8G6kzbePHWdZyyWlAuavj4OAEwAjpWpP8/daus0ha4xywlVVoKjAAAAFQCbiW4bR+tgMvjrxC198dqI1mTbjQAAAIBzCzkJTtnGDKfOHq2dFI5cUEuaj1PgRot3wyaXENzUjZVnIFgXUmgKDCxO+EAtU6uAkBPQF4XNgiuaw5bavYpZxcJ4WIpM4ZDRoSkc7BBbJPRLZ45GfrHJwgqAmAZ3RSvVqeXE4WKQHLm43/eDHewgPqqqWe6QVuQH5SEe79yk3wAAAIEArG+AuupiAeoVJ9Lh36QMj4kRo5pTASh2eD5MqSOdy39UhsXbWBcj3JCIvNk/nwep/9neGyRZ5t5wT05dRX80vlgZJX65hrbepO+lqC3wlng+6GQ34D7TJKYnvEkR3neE0+06kx5R6IRWZf1YQV6fMQhx8AJ2JmvnLFicmYlkhQQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDND+RJCrYgIUzolo5fZ64Ey6cksefKDUWmGDjsqVTmuT3HrlDyUZOro4JAnUQBmiamXsJUFbrFdJAVpukD4yyowqCQLr0ZFuKNEzrt5CObrtWflOskKynO3kaoU0WhDkqIbwS2j/+NxBCxgDGqd/5Os3cOMv3eyjUElz6xoI4zsmGMfxVYmT+/SHBfoyxyqY8Hw2Ooq+H5L9OlYgV4hqu7kKPpM1THUJTjy47m6qvws5gztclLjPA1KIW2Dz6kKzUYspNJcoS2sK1xFvL7mBjpGAP7WhXVH2n5ySenQ24Z6mEj+tG2f11rjPpjCUjDzzciGCWiRDZWBLm/GGmQXJJ8zAYnw82yIUKqufLrr1wmcXICPMVj9pFjXSoBWe/yhX9E87w7YD5HWsUrgrLdSctdV4QYy+R5g9ERi7FjwbRsuZ04BihZs70+f/29hUzuc6MA87KVovGT0Uc7GVC7bx8NLt0bTBsbydlONVHVQuol/YEpQrQophDvmBfh+PgMDH8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOEITn1vyppR+Moe1UdR0WGPhUnQ/dwHNcNi0OYy21LkBQ5jsxOPLvZ+C2MbRYlz2afs4nYYIV8E0AuK6aRks3w=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKEdFOHVk9tX1R+zEyLVdxS/U5QeeeFYWSnUmjpXlpt7", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "38", "second": "38", "epoch": "1726882718", "epoch_int": "1726882718", "date": "2024-09-20", "time": "21:38:38", "iso8601_micro": "2024-09-21T01:38:38.043531Z", "iso8601": "2024-09-21T01:38:38Z", "iso8601_basic": "20240920T213838043531", "iso8601_basic_short": "20240920T213838", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.4, "5m": 0.4, "15m": 0.24}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2792, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 740, "free": 2792}, "nocache": {"free": 3255, "used": 277}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_uuid": "ec2e6858-9a88-b36a-7765-70992ab591a7", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", 
"host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 657, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264238559232, "block_size": 4096, "block_total": 65519355, "block_available": 64511367, "block_used": 1007988, "inode_total": 131071472, "inode_available": 130998693, "inode_used": 72779, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 33528 10.31.11.158 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 33528 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off 
[fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::104f:68ff:fe7a:deb1", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.158", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:4f:68:7a:de:b1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.158"], "ansible_all_ipv6_addresses": ["fe80::104f:68ff:fe7a:deb1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.158", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::104f:68ff:fe7a:deb1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
26764 1726882718.35628: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882718.35650: _low_level_execute_command(): starting 26764 1726882718.35658: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882717.5929146-26918-106595545220817/ > /dev/null 2>&1 && sleep 0' 26764 1726882718.37241: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882718.37245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882718.37285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882718.37289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882718.37293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882718.37383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882718.37446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882718.37515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882718.37521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882718.37656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882718.40353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882718.40357: stdout chunk (state=3): >>><<< 26764 1726882718.40369: stderr chunk (state=3): >>><<< 26764 1726882718.40574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882718.40578: handler run complete 26764 1726882718.40580: variable 'ansible_facts' from source: unknown 26764 1726882718.40624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882718.40951: variable 'ansible_facts' from source: unknown 26764 1726882718.41152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882718.41403: attempt loop complete, returning result 26764 1726882718.41412: _execute() done 26764 1726882718.41417: dumping result to json 26764 1726882718.41454: done dumping result, returning 26764 1726882718.41481: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0e448fcc-3ce9-9875-c9a3-000000000093] 26764 1726882718.41557: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000093 ok: [managed_node2] 26764 1726882718.42267: no more pending results, returning what we have 26764 1726882718.42270: results queue empty 26764 1726882718.42271: checking for any_errors_fatal 26764 1726882718.42273: done checking for any_errors_fatal 26764 1726882718.42274: checking for max_fail_percentage 26764 1726882718.42276: done checking for max_fail_percentage 26764 1726882718.42277: checking to see if all hosts have failed and the running result is not ok 26764 1726882718.42278: done checking to see if all hosts have failed 26764 1726882718.42279: getting the remaining hosts for this loop 26764 1726882718.42280: done getting the remaining hosts for this loop 26764 1726882718.42284: getting the next task for host managed_node2 26764 1726882718.42291: done getting next task for host managed_node2 26764 1726882718.42293: ^ task is: TASK: Show test banner 26764 1726882718.42295: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882718.42298: getting variables 26764 1726882718.42300: in VariableManager get_vars() 26764 1726882718.42333: Calling all_inventory to load vars for managed_node2 26764 1726882718.42336: Calling groups_inventory to load vars for managed_node2 26764 1726882718.42338: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882718.42352: Calling all_plugins_play to load vars for managed_node2 26764 1726882718.42355: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882718.42358: Calling groups_plugins_play to load vars for managed_node2 26764 1726882718.42508: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000093 26764 1726882718.42512: WORKER PROCESS EXITING 26764 1726882718.42598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882718.42795: done with get_vars() 26764 1726882718.42918: done getting variables 26764 1726882718.43015: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show test banner] ******************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:14 Friday 20 September 2024 21:38:38 -0400 (0:00:00.947) 0:00:04.374 ****** 26764 1726882718.43262: entering _queue_task() for managed_node2/debug 26764 1726882718.43268: Creating lock for debug 26764 1726882718.43980: worker is 1 (out of 1 available) 26764 1726882718.44215: exiting _queue_task() for managed_node2/debug 26764 1726882718.44227: done queuing things up, now waiting for results queue to drain 26764 1726882718.44228: waiting for pending results... 
26764 1726882718.44657: running TaskExecutor() for managed_node2/TASK: Show test banner 26764 1726882718.44844: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000026 26764 1726882718.44896: variable 'ansible_search_path' from source: unknown 26764 1726882718.44937: calling self._execute() 26764 1726882718.45061: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882718.45180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882718.45195: variable 'omit' from source: magic vars 26764 1726882718.45897: variable 'ansible_distribution_major_version' from source: facts 26764 1726882718.45988: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882718.46000: variable 'omit' from source: magic vars 26764 1726882718.46027: variable 'omit' from source: magic vars 26764 1726882718.46117: variable 'omit' from source: magic vars 26764 1726882718.46161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882718.46332: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882718.47180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882718.47202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882718.47218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882718.47251: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882718.47378: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882718.47386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882718.47489: Set connection var ansible_shell_executable to /bin/sh 26764 1726882718.47577: Set connection var ansible_shell_type to sh 26764 1726882718.47593: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882718.47602: Set connection var ansible_timeout to 10 26764 1726882718.47787: Set connection var ansible_connection to ssh 26764 1726882718.47797: Set connection var ansible_pipelining to False 26764 1726882718.47822: variable 'ansible_shell_executable' from source: unknown 26764 1726882718.47876: variable 'ansible_connection' from source: unknown 26764 1726882718.47886: variable 'ansible_module_compression' from source: unknown 26764 1726882718.47893: variable 'ansible_shell_type' from source: unknown 26764 1726882718.47899: variable 'ansible_shell_executable' from source: unknown 26764 1726882718.47905: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882718.47912: variable 'ansible_pipelining' from source: unknown 26764 1726882718.47918: variable 'ansible_timeout' from source: unknown 26764 1726882718.47925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882718.48266: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882718.48383: variable 'omit' from source: magic vars 26764 1726882718.48393: starting attempt loop 26764 1726882718.48400: running the handler 26764 
1726882718.48447: handler run complete 26764 1726882718.48694: attempt loop complete, returning result 26764 1726882718.48779: _execute() done 26764 1726882718.48787: dumping result to json 26764 1726882718.48795: done dumping result, returning 26764 1726882718.48805: done running TaskExecutor() for managed_node2/TASK: Show test banner [0e448fcc-3ce9-9875-c9a3-000000000026] 26764 1726882718.48814: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000026 ok: [managed_node2] => {} MSG: Inside states tests 26764 1726882718.48956: no more pending results, returning what we have 26764 1726882718.48959: results queue empty 26764 1726882718.48960: checking for any_errors_fatal 26764 1726882718.48978: done checking for any_errors_fatal 26764 1726882718.48979: checking for max_fail_percentage 26764 1726882718.48981: done checking for max_fail_percentage 26764 1726882718.48983: checking to see if all hosts have failed and the running result is not ok 26764 1726882718.48984: done checking to see if all hosts have failed 26764 1726882718.48985: getting the remaining hosts for this loop 26764 1726882718.48986: done getting the remaining hosts for this loop 26764 1726882718.48990: getting the next task for host managed_node2 26764 1726882718.48996: done getting next task for host managed_node2 26764 1726882718.48998: ^ task is: TASK: Include the task 'show_interfaces.yml' 26764 1726882718.49000: ^ state is: HOST STATE: block=1, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882718.49004: getting variables 26764 1726882718.49005: in VariableManager get_vars() 26764 1726882718.49040: Calling all_inventory to load vars for managed_node2 26764 1726882718.49042: Calling groups_inventory to load vars for managed_node2 26764 1726882718.49044: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882718.49056: Calling all_plugins_play to load vars for managed_node2 26764 1726882718.49058: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882718.49060: Calling groups_plugins_play to load vars for managed_node2 26764 1726882718.49217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882718.49455: done with get_vars() 26764 1726882718.49468: done getting variables 26764 1726882718.49502: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000026 26764 1726882718.49506: WORKER PROCESS EXITING TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:17 Friday 20 September 2024 21:38:38 -0400 (0:00:00.063) 0:00:04.437 ****** 26764 1726882718.49572: entering _queue_task() for managed_node2/include_tasks 26764 1726882718.50016: worker is 1 (out of 1 available) 26764 1726882718.50141: exiting _queue_task() for managed_node2/include_tasks 26764 1726882718.50154: done queuing things up, now waiting for results queue to drain 26764 1726882718.50156: waiting for pending results... 
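The "Show test banner" task at tests_reapply.yml:14 resolves to the debug action plugin and runs entirely on the controller, which is why no module transfer or SSH exchange appears between "running the handler" and "handler run complete"; the result simply echoes the message "Inside states tests". A hedged sketch of a task with that shape (the actual file content is not reproduced in this log):

    # Assumed shape of the banner task; only the task name and message are confirmed by the log.
    - name: Show test banner
      ansible.builtin.debug:
        msg: Inside states tests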
26764 1726882718.50873: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 26764 1726882718.51158: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000027 26764 1726882718.51181: variable 'ansible_search_path' from source: unknown 26764 1726882718.51218: calling self._execute() 26764 1726882718.51298: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882718.51378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882718.51392: variable 'omit' from source: magic vars 26764 1726882718.52125: variable 'ansible_distribution_major_version' from source: facts 26764 1726882718.52229: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882718.52240: _execute() done 26764 1726882718.52247: dumping result to json 26764 1726882718.52254: done dumping result, returning 26764 1726882718.52262: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0e448fcc-3ce9-9875-c9a3-000000000027] 26764 1726882718.52277: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000027 26764 1726882718.52396: no more pending results, returning what we have 26764 1726882718.52401: in VariableManager get_vars() 26764 1726882718.52442: Calling all_inventory to load vars for managed_node2 26764 1726882718.52445: Calling groups_inventory to load vars for managed_node2 26764 1726882718.52447: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882718.52466: Calling all_plugins_play to load vars for managed_node2 26764 1726882718.52470: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882718.52474: Calling groups_plugins_play to load vars for managed_node2 26764 1726882718.52641: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000027 26764 1726882718.52645: WORKER PROCESS EXITING 26764 1726882718.52659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882718.52852: done with get_vars() 26764 1726882718.52866: variable 'ansible_search_path' from source: unknown 26764 1726882718.52881: we have included files to process 26764 1726882718.52882: generating all_blocks data 26764 1726882718.52883: done generating all_blocks data 26764 1726882718.52884: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 26764 1726882718.52886: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 26764 1726882718.52888: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 26764 1726882718.53033: in VariableManager get_vars() 26764 1726882718.53051: done with get_vars() 26764 1726882718.53519: done processing included file 26764 1726882718.53521: iterating over new_blocks loaded from include file 26764 1726882718.53522: in VariableManager get_vars() 26764 1726882718.53537: done with get_vars() 26764 1726882718.53539: filtering new block on tags 26764 1726882718.53555: done filtering new block on tags 26764 1726882718.53558: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 26764 1726882718.53563: extending task lists for all hosts with included blocks 26764 
1726882718.53857: done extending task lists 26764 1726882718.53859: done processing included files 26764 1726882718.53859: results queue empty 26764 1726882718.53860: checking for any_errors_fatal 26764 1726882718.54058: done checking for any_errors_fatal 26764 1726882718.54060: checking for max_fail_percentage 26764 1726882718.54062: done checking for max_fail_percentage 26764 1726882718.54062: checking to see if all hosts have failed and the running result is not ok 26764 1726882718.54067: done checking to see if all hosts have failed 26764 1726882718.54068: getting the remaining hosts for this loop 26764 1726882718.54069: done getting the remaining hosts for this loop 26764 1726882718.54072: getting the next task for host managed_node2 26764 1726882718.54076: done getting next task for host managed_node2 26764 1726882718.54078: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 26764 1726882718.54080: ^ state is: HOST STATE: block=1, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882718.54083: getting variables 26764 1726882718.54084: in VariableManager get_vars() 26764 1726882718.54095: Calling all_inventory to load vars for managed_node2 26764 1726882718.54097: Calling groups_inventory to load vars for managed_node2 26764 1726882718.54099: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882718.54103: Calling all_plugins_play to load vars for managed_node2 26764 1726882718.54106: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882718.54109: Calling groups_plugins_play to load vars for managed_node2 26764 1726882718.54724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882718.55138: done with get_vars() 26764 1726882718.55375: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:38:38 -0400 (0:00:00.058) 0:00:04.496 ****** 26764 1726882718.55444: entering _queue_task() for managed_node2/include_tasks 26764 1726882718.56123: worker is 1 (out of 1 available) 26764 1726882718.56358: exiting _queue_task() for managed_node2/include_tasks 26764 1726882718.56374: done queuing things up, now waiting for results queue to drain 26764 1726882718.56375: waiting for pending results... 
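show_interfaces.yml is processed on the controller as an include: its blocks are loaded, filtered on tags, and appended to the task list for managed_node2, which is why the next queued task is the nested include of get_current_interfaces.yml rather than anything executed on the remote host. A sketch of the include step this log implies, assuming the conventional include_tasks form (the exact file contents are not shown here):

    # Assumed form of the nested include at show_interfaces.yml:3; only the task name and
    # the include_tasks action are confirmed by the log.
    - name: Include the task 'get_current_interfaces.yml'
      ansible.builtin.include_tasks:
        file: get_current_interfaces.yml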
26764 1726882718.56808: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 26764 1726882718.57067: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000000d9 26764 1726882718.57088: variable 'ansible_search_path' from source: unknown 26764 1726882718.57095: variable 'ansible_search_path' from source: unknown 26764 1726882718.57132: calling self._execute() 26764 1726882718.57240: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882718.57389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882718.57403: variable 'omit' from source: magic vars 26764 1726882718.58093: variable 'ansible_distribution_major_version' from source: facts 26764 1726882718.58111: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882718.58121: _execute() done 26764 1726882718.58128: dumping result to json 26764 1726882718.58140: done dumping result, returning 26764 1726882718.58150: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0e448fcc-3ce9-9875-c9a3-0000000000d9] 26764 1726882718.58255: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000000d9 26764 1726882718.58382: no more pending results, returning what we have 26764 1726882718.58387: in VariableManager get_vars() 26764 1726882718.58427: Calling all_inventory to load vars for managed_node2 26764 1726882718.58430: Calling groups_inventory to load vars for managed_node2 26764 1726882718.58432: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882718.58447: Calling all_plugins_play to load vars for managed_node2 26764 1726882718.58449: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882718.58453: Calling groups_plugins_play to load vars for managed_node2 26764 1726882718.58633: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000000d9 26764 1726882718.58637: WORKER PROCESS EXITING 26764 1726882718.58644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882718.58842: done with get_vars() 26764 1726882718.58853: variable 'ansible_search_path' from source: unknown 26764 1726882718.58854: variable 'ansible_search_path' from source: unknown 26764 1726882718.58892: we have included files to process 26764 1726882718.58894: generating all_blocks data 26764 1726882718.58895: done generating all_blocks data 26764 1726882718.58896: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 26764 1726882718.58898: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 26764 1726882718.58900: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 26764 1726882718.59509: done processing included file 26764 1726882718.59511: iterating over new_blocks loaded from include file 26764 1726882718.59512: in VariableManager get_vars() 26764 1726882718.59526: done with get_vars() 26764 1726882718.59527: filtering new block on tags 26764 1726882718.59658: done filtering new block on tags 26764 1726882718.59662: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node2 26764 1726882718.59667: extending task lists for all hosts with included blocks 26764 1726882718.59884: done extending task lists 26764 1726882718.59885: done processing included files 26764 1726882718.59886: results queue empty 26764 1726882718.59887: checking for any_errors_fatal 26764 1726882718.59889: done checking for any_errors_fatal 26764 1726882718.59890: checking for max_fail_percentage 26764 1726882718.59891: done checking for max_fail_percentage 26764 1726882718.59892: checking to see if all hosts have failed and the running result is not ok 26764 1726882718.59892: done checking to see if all hosts have failed 26764 1726882718.59893: getting the remaining hosts for this loop 26764 1726882718.59894: done getting the remaining hosts for this loop 26764 1726882718.59897: getting the next task for host managed_node2 26764 1726882718.59900: done getting next task for host managed_node2 26764 1726882718.59902: ^ task is: TASK: Gather current interface info 26764 1726882718.59905: ^ state is: HOST STATE: block=1, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882718.59907: getting variables 26764 1726882718.59908: in VariableManager get_vars() 26764 1726882718.59943: Calling all_inventory to load vars for managed_node2 26764 1726882718.59946: Calling groups_inventory to load vars for managed_node2 26764 1726882718.59948: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882718.59953: Calling all_plugins_play to load vars for managed_node2 26764 1726882718.59955: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882718.59958: Calling groups_plugins_play to load vars for managed_node2 26764 1726882718.60286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882718.60658: done with get_vars() 26764 1726882718.60668: done getting variables 26764 1726882718.60705: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:38:38 -0400 (0:00:00.052) 0:00:04.548 ****** 26764 1726882718.60732: entering _queue_task() for managed_node2/command 26764 1726882718.61279: worker is 1 (out of 1 available) 26764 1726882718.61290: exiting _queue_task() for managed_node2/command 26764 1726882718.61303: done queuing things up, now waiting for results queue to drain 26764 1726882718.61303: waiting for pending results... 
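The 'Gather current interface info' task queued above (get_current_interfaces.yml:3) and the 'Set current_interfaces' task that follows it (get_current_interfaces.yml:9) can be inferred from the module arguments and variable names that appear further down in this trace (chdir=/sys/class/net, _raw_params='ls -1', the _current_interfaces variable, the current_interfaces fact). A hedged sketch of what that task file likely contains:

    # get_current_interfaces.yml (sketch inferred from the module args and facts visible in this log)
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces
      changed_when: false    # assumption; the raw module result reports changed=true, the task result changed=false

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # exact expression assumed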
26764 1726882718.62163: running TaskExecutor() for managed_node2/TASK: Gather current interface info 26764 1726882718.62261: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000000e8 26764 1726882718.62383: variable 'ansible_search_path' from source: unknown 26764 1726882718.62391: variable 'ansible_search_path' from source: unknown 26764 1726882718.62427: calling self._execute() 26764 1726882718.62503: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882718.62582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882718.62597: variable 'omit' from source: magic vars 26764 1726882718.63352: variable 'ansible_distribution_major_version' from source: facts 26764 1726882718.63373: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882718.63386: variable 'omit' from source: magic vars 26764 1726882718.63433: variable 'omit' from source: magic vars 26764 1726882718.63479: variable 'omit' from source: magic vars 26764 1726882718.63600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882718.63639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882718.63792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882718.63816: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882718.63834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882718.63869: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882718.63883: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882718.63891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882718.64075: Set connection var ansible_shell_executable to /bin/sh 26764 1726882718.64176: Set connection var ansible_shell_type to sh 26764 1726882718.64192: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882718.64204: Set connection var ansible_timeout to 10 26764 1726882718.64214: Set connection var ansible_connection to ssh 26764 1726882718.64222: Set connection var ansible_pipelining to False 26764 1726882718.64247: variable 'ansible_shell_executable' from source: unknown 26764 1726882718.64254: variable 'ansible_connection' from source: unknown 26764 1726882718.64261: variable 'ansible_module_compression' from source: unknown 26764 1726882718.64270: variable 'ansible_shell_type' from source: unknown 26764 1726882718.64278: variable 'ansible_shell_executable' from source: unknown 26764 1726882718.64316: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882718.64324: variable 'ansible_pipelining' from source: unknown 26764 1726882718.64330: variable 'ansible_timeout' from source: unknown 26764 1726882718.64338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882718.64593: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882718.64648: variable 'omit' from source: magic vars 26764 
1726882718.64658: starting attempt loop 26764 1726882718.64749: running the handler 26764 1726882718.64773: _low_level_execute_command(): starting 26764 1726882718.64787: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882718.66625: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882718.66639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882718.66653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882718.66674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882718.66794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882718.66809: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882718.66822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882718.66840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882718.66851: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882718.66861: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882718.66876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882718.66888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882718.66903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882718.66918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882718.66929: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882718.66942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882718.67018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882718.67154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882718.67173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882718.67373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882718.69651: stdout chunk (state=3): >>>/root <<< 26764 1726882718.69868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882718.69871: stdout chunk (state=3): >>><<< 26764 1726882718.69875: stderr chunk (state=3): >>><<< 26764 1726882718.69980: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882718.69985: _low_level_execute_command(): starting 26764 1726882718.69988: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678 `" && echo ansible-tmp-1726882718.6989594-26999-119629532463678="` echo /root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678 `" ) && sleep 0' 26764 1726882718.71399: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882718.71416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882718.71431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882718.71543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882718.71588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882718.71601: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882718.71628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882718.71651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882718.71662: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882718.71680: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882718.71693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882718.71705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882718.71719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882718.71731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882718.71746: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882718.71759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882718.71834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882718.71987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882718.72004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882718.72140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882718.75070: stdout chunk (state=3): >>>ansible-tmp-1726882718.6989594-26999-119629532463678=/root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678 <<< 26764 1726882718.75283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882718.75338: stderr chunk (state=3): >>><<< 26764 1726882718.75341: stdout chunk (state=3): >>><<< 26764 1726882718.75480: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882718.6989594-26999-119629532463678=/root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882718.75483: variable 'ansible_module_compression' from source: unknown 26764 1726882718.75485: ANSIBALLZ: Using generic lock for ansible.legacy.command 26764 1726882718.75487: ANSIBALLZ: Acquiring lock 26764 1726882718.75489: ANSIBALLZ: Lock acquired: 140693693673600 26764 1726882718.75491: ANSIBALLZ: Creating module 26764 1726882719.01020: ANSIBALLZ: Writing module into payload 26764 1726882719.01154: ANSIBALLZ: Writing module 26764 1726882719.01184: ANSIBALLZ: Renaming module 26764 1726882719.01195: ANSIBALLZ: Done creating module 26764 1726882719.01224: variable 'ansible_facts' from source: unknown 26764 1726882719.01298: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678/AnsiballZ_command.py 26764 1726882719.02393: Sending initial data 26764 1726882719.02396: Sent initial data (156 bytes) 26764 1726882719.04546: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882719.04568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.04586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.04606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.04648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.04662: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882719.04688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.04708: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882719.04720: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882719.04730: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882719.04742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.04754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.04775: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.04791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.04803: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882719.04819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.04900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882719.04928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882719.04945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882719.05090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882719.07584: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 26764 1726882719.07591: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882719.07680: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882719.07789: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpst1w2hyn /root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678/AnsiballZ_command.py <<< 26764 1726882719.07895: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882719.09398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882719.09470: stderr chunk (state=3): >>><<< 26764 1726882719.09582: stdout chunk (state=3): >>><<< 26764 1726882719.09585: done transferring module to remote 26764 1726882719.09587: _low_level_execute_command(): starting 26764 1726882719.09590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678/ /root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678/AnsiballZ_command.py && sleep 0' 26764 1726882719.11015: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882719.11028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.11042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.11059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.11104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.11280: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882719.11295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.11312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882719.11325: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address <<< 26764 1726882719.11336: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882719.11348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.11360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.11381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.11393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.11403: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882719.11415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.11495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882719.11516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882719.11531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882719.11667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882719.14225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882719.14228: stdout chunk (state=3): >>><<< 26764 1726882719.14230: stderr chunk (state=3): >>><<< 26764 1726882719.14270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882719.14274: _low_level_execute_command(): starting 26764 1726882719.14277: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678/AnsiballZ_command.py && sleep 0' 26764 1726882719.15652: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882719.15726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.15743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.15762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.15808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.15833: stderr chunk (state=3): >>>debug2: match not 
found <<< 26764 1726882719.15848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.15869: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882719.15952: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882719.15969: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882719.15983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.15998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.16013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.16026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.16038: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882719.16054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.16135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882719.16271: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882719.16296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882719.16443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882719.38356: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:39.376564", "end": "2024-09-20 21:38:39.381173", "delta": "0:00:00.004609", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26764 1726882719.40087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882719.40201: stderr chunk (state=3): >>>Shared connection to 10.31.11.158 closed. 
<<< 26764 1726882719.40205: stdout chunk (state=3): >>><<< 26764 1726882719.40208: stderr chunk (state=3): >>><<< 26764 1726882719.40272: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:38:39.376564", "end": "2024-09-20 21:38:39.381173", "delta": "0:00:00.004609", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
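Each command in this trace goes through the full remote round trip visible above: create a per-task temp directory, sftp the AnsiballZ_command.py payload, chmod it, run it with the remote Python, then remove the directory. That is because the connection variables earlier show 'Set connection var ansible_pipelining to False'. If the managed hosts allow it (for example, no requiretty restriction in sudoers), pipelining can be enabled per host or group to skip the temp-file steps; a minimal sketch as an inventory group_vars entry, file name assumed and not part of this test run:

    # group_vars/all.yml (hedged example)
    ansible_pipelining: true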
26764 1726882719.40283: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882719.40371: _low_level_execute_command(): starting 26764 1726882719.40375: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882718.6989594-26999-119629532463678/ > /dev/null 2>&1 && sleep 0' 26764 1726882719.41822: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882719.41943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.41960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.41983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.42028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.42087: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882719.42102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.42120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882719.42132: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882719.42145: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882719.42159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.42278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.42296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.42308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.42320: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882719.42334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.42417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882719.42439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882719.42457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882719.42610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882719.45192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882719.45235: stderr chunk (state=3): >>><<< 26764 1726882719.45238: stdout chunk (state=3): >>><<< 26764 1726882719.45270: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882719.45274: handler run complete 26764 1726882719.45473: Evaluated conditional (False): False 26764 1726882719.45477: attempt loop complete, returning result 26764 1726882719.45479: _execute() done 26764 1726882719.45481: dumping result to json 26764 1726882719.45483: done dumping result, returning 26764 1726882719.45485: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0e448fcc-3ce9-9875-c9a3-0000000000e8] 26764 1726882719.45487: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000000e8 26764 1726882719.45560: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000000e8 26764 1726882719.45569: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004609", "end": "2024-09-20 21:38:39.381173", "rc": 0, "start": "2024-09-20 21:38:39.376564" } STDOUT: bonding_masters eth0 lo 26764 1726882719.45640: no more pending results, returning what we have 26764 1726882719.45644: results queue empty 26764 1726882719.45645: checking for any_errors_fatal 26764 1726882719.45646: done checking for any_errors_fatal 26764 1726882719.45648: checking for max_fail_percentage 26764 1726882719.45649: done checking for max_fail_percentage 26764 1726882719.45651: checking to see if all hosts have failed and the running result is not ok 26764 1726882719.45652: done checking to see if all hosts have failed 26764 1726882719.45653: getting the remaining hosts for this loop 26764 1726882719.45654: done getting the remaining hosts for this loop 26764 1726882719.45657: getting the next task for host managed_node2 26764 1726882719.45665: done getting next task for host managed_node2 26764 1726882719.45668: ^ task is: TASK: Set current_interfaces 26764 1726882719.45671: ^ state is: HOST STATE: block=1, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882719.45674: getting variables 26764 1726882719.45676: in VariableManager get_vars() 26764 1726882719.45714: Calling all_inventory to load vars for managed_node2 26764 1726882719.45716: Calling groups_inventory to load vars for managed_node2 26764 1726882719.45719: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882719.45730: Calling all_plugins_play to load vars for managed_node2 26764 1726882719.45732: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882719.45735: Calling groups_plugins_play to load vars for managed_node2 26764 1726882719.45901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882719.46139: done with get_vars() 26764 1726882719.46149: done getting variables 26764 1726882719.46218: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:38:39 -0400 (0:00:00.855) 0:00:05.404 ****** 26764 1726882719.46248: entering _queue_task() for managed_node2/set_fact 26764 1726882719.47288: worker is 1 (out of 1 available) 26764 1726882719.47401: exiting _queue_task() for managed_node2/set_fact 26764 1726882719.47415: done queuing things up, now waiting for results queue to drain 26764 1726882719.47416: waiting for pending results... 
26764 1726882719.47955: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 26764 1726882719.48063: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000000e9 26764 1726882719.48089: variable 'ansible_search_path' from source: unknown 26764 1726882719.48097: variable 'ansible_search_path' from source: unknown 26764 1726882719.48137: calling self._execute() 26764 1726882719.48331: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.48344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.48359: variable 'omit' from source: magic vars 26764 1726882719.49195: variable 'ansible_distribution_major_version' from source: facts 26764 1726882719.49213: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882719.49226: variable 'omit' from source: magic vars 26764 1726882719.49298: variable 'omit' from source: magic vars 26764 1726882719.49475: variable '_current_interfaces' from source: set_fact 26764 1726882719.49636: variable 'omit' from source: magic vars 26764 1726882719.49739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882719.49806: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882719.49943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882719.49970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882719.49988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882719.50021: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882719.50035: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.50110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.50243: Set connection var ansible_shell_executable to /bin/sh 26764 1726882719.50368: Set connection var ansible_shell_type to sh 26764 1726882719.50388: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882719.50400: Set connection var ansible_timeout to 10 26764 1726882719.50410: Set connection var ansible_connection to ssh 26764 1726882719.50419: Set connection var ansible_pipelining to False 26764 1726882719.50445: variable 'ansible_shell_executable' from source: unknown 26764 1726882719.50455: variable 'ansible_connection' from source: unknown 26764 1726882719.50476: variable 'ansible_module_compression' from source: unknown 26764 1726882719.50556: variable 'ansible_shell_type' from source: unknown 26764 1726882719.50569: variable 'ansible_shell_executable' from source: unknown 26764 1726882719.50580: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.50593: variable 'ansible_pipelining' from source: unknown 26764 1726882719.50600: variable 'ansible_timeout' from source: unknown 26764 1726882719.50609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.50945: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 26764 1726882719.50961: variable 'omit' from source: magic vars 26764 1726882719.50976: starting attempt loop 26764 1726882719.50982: running the handler 26764 1726882719.50996: handler run complete 26764 1726882719.51010: attempt loop complete, returning result 26764 1726882719.51022: _execute() done 26764 1726882719.51030: dumping result to json 26764 1726882719.51038: done dumping result, returning 26764 1726882719.51049: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0e448fcc-3ce9-9875-c9a3-0000000000e9] 26764 1726882719.51135: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000000e9 26764 1726882719.51233: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000000e9 ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 26764 1726882719.51296: no more pending results, returning what we have 26764 1726882719.51299: results queue empty 26764 1726882719.51300: checking for any_errors_fatal 26764 1726882719.51306: done checking for any_errors_fatal 26764 1726882719.51307: checking for max_fail_percentage 26764 1726882719.51309: done checking for max_fail_percentage 26764 1726882719.51310: checking to see if all hosts have failed and the running result is not ok 26764 1726882719.51310: done checking to see if all hosts have failed 26764 1726882719.51311: getting the remaining hosts for this loop 26764 1726882719.51312: done getting the remaining hosts for this loop 26764 1726882719.51315: getting the next task for host managed_node2 26764 1726882719.51325: done getting next task for host managed_node2 26764 1726882719.51327: ^ task is: TASK: Show current_interfaces 26764 1726882719.51330: ^ state is: HOST STATE: block=1, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882719.51335: getting variables 26764 1726882719.51337: in VariableManager get_vars() 26764 1726882719.51376: Calling all_inventory to load vars for managed_node2 26764 1726882719.51379: Calling groups_inventory to load vars for managed_node2 26764 1726882719.51381: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882719.51394: Calling all_plugins_play to load vars for managed_node2 26764 1726882719.51397: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882719.51401: Calling groups_plugins_play to load vars for managed_node2 26764 1726882719.51558: WORKER PROCESS EXITING 26764 1726882719.51574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882719.51782: done with get_vars() 26764 1726882719.51793: done getting variables 26764 1726882719.51850: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:38:39 -0400 (0:00:00.056) 0:00:05.460 ****** 26764 1726882719.51882: entering _queue_task() for managed_node2/debug 26764 1726882719.52689: worker is 1 (out of 1 available) 26764 1726882719.52699: exiting _queue_task() for managed_node2/debug 26764 1726882719.52709: done queuing things up, now waiting for results queue to drain 26764 1726882719.52710: waiting for pending results... 
26764 1726882719.53099: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 26764 1726882719.53303: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000000da 26764 1726882719.53385: variable 'ansible_search_path' from source: unknown 26764 1726882719.53436: variable 'ansible_search_path' from source: unknown 26764 1726882719.53478: calling self._execute() 26764 1726882719.53617: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.53679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.53772: variable 'omit' from source: magic vars 26764 1726882719.54357: variable 'ansible_distribution_major_version' from source: facts 26764 1726882719.54485: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882719.54497: variable 'omit' from source: magic vars 26764 1726882719.54540: variable 'omit' from source: magic vars 26764 1726882719.54716: variable 'current_interfaces' from source: set_fact 26764 1726882719.54870: variable 'omit' from source: magic vars 26764 1726882719.54913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882719.55068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882719.55094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882719.55115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882719.55131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882719.55171: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882719.55181: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.55190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.55408: Set connection var ansible_shell_executable to /bin/sh 26764 1726882719.55416: Set connection var ansible_shell_type to sh 26764 1726882719.55445: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882719.55480: Set connection var ansible_timeout to 10 26764 1726882719.55497: Set connection var ansible_connection to ssh 26764 1726882719.55535: Set connection var ansible_pipelining to False 26764 1726882719.55560: variable 'ansible_shell_executable' from source: unknown 26764 1726882719.55604: variable 'ansible_connection' from source: unknown 26764 1726882719.55614: variable 'ansible_module_compression' from source: unknown 26764 1726882719.55621: variable 'ansible_shell_type' from source: unknown 26764 1726882719.55646: variable 'ansible_shell_executable' from source: unknown 26764 1726882719.55653: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.55677: variable 'ansible_pipelining' from source: unknown 26764 1726882719.55685: variable 'ansible_timeout' from source: unknown 26764 1726882719.55713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.55971: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
26764 1726882719.56045: variable 'omit' from source: magic vars 26764 1726882719.56078: starting attempt loop 26764 1726882719.56085: running the handler 26764 1726882719.56186: handler run complete 26764 1726882719.56230: attempt loop complete, returning result 26764 1726882719.56276: _execute() done 26764 1726882719.56284: dumping result to json 26764 1726882719.56291: done dumping result, returning 26764 1726882719.56371: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0e448fcc-3ce9-9875-c9a3-0000000000da] 26764 1726882719.56382: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000000da ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 26764 1726882719.56521: no more pending results, returning what we have 26764 1726882719.56524: results queue empty 26764 1726882719.56526: checking for any_errors_fatal 26764 1726882719.56530: done checking for any_errors_fatal 26764 1726882719.56531: checking for max_fail_percentage 26764 1726882719.56532: done checking for max_fail_percentage 26764 1726882719.56533: checking to see if all hosts have failed and the running result is not ok 26764 1726882719.56534: done checking to see if all hosts have failed 26764 1726882719.56535: getting the remaining hosts for this loop 26764 1726882719.56536: done getting the remaining hosts for this loop 26764 1726882719.56539: getting the next task for host managed_node2 26764 1726882719.56547: done getting next task for host managed_node2 26764 1726882719.56550: ^ task is: TASK: Include the task 'assert_device_absent.yml' 26764 1726882719.56552: ^ state is: HOST STATE: block=1, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882719.56555: getting variables 26764 1726882719.56557: in VariableManager get_vars() 26764 1726882719.56594: Calling all_inventory to load vars for managed_node2 26764 1726882719.56597: Calling groups_inventory to load vars for managed_node2 26764 1726882719.56600: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882719.56613: Calling all_plugins_play to load vars for managed_node2 26764 1726882719.56616: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882719.56620: Calling groups_plugins_play to load vars for managed_node2 26764 1726882719.56796: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000000da 26764 1726882719.56800: WORKER PROCESS EXITING 26764 1726882719.56814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882719.57770: done with get_vars() 26764 1726882719.57778: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:19 Friday 20 September 2024 21:38:39 -0400 (0:00:00.059) 0:00:05.520 ****** 26764 1726882719.57849: entering _queue_task() for managed_node2/include_tasks 26764 1726882719.58687: worker is 1 (out of 1 available) 26764 1726882719.58699: exiting _queue_task() for managed_node2/include_tasks 26764 1726882719.58711: done queuing things up, now waiting for results queue to drain 26764 1726882719.58712: waiting for pending results... 
26764 1726882719.59391: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' 26764 1726882719.59481: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000028 26764 1726882719.59586: variable 'ansible_search_path' from source: unknown 26764 1726882719.59806: calling self._execute() 26764 1726882719.59890: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.59902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.59916: variable 'omit' from source: magic vars 26764 1726882719.60410: variable 'ansible_distribution_major_version' from source: facts 26764 1726882719.60428: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882719.60440: _execute() done 26764 1726882719.60449: dumping result to json 26764 1726882719.60458: done dumping result, returning 26764 1726882719.60473: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' [0e448fcc-3ce9-9875-c9a3-000000000028] 26764 1726882719.60489: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000028 26764 1726882719.60593: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000028 26764 1726882719.60623: no more pending results, returning what we have 26764 1726882719.60630: in VariableManager get_vars() 26764 1726882719.60671: Calling all_inventory to load vars for managed_node2 26764 1726882719.60675: Calling groups_inventory to load vars for managed_node2 26764 1726882719.60677: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882719.60697: Calling all_plugins_play to load vars for managed_node2 26764 1726882719.60700: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882719.60705: Calling groups_plugins_play to load vars for managed_node2 26764 1726882719.60886: WORKER PROCESS EXITING 26764 1726882719.60900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882719.61107: done with get_vars() 26764 1726882719.61115: variable 'ansible_search_path' from source: unknown 26764 1726882719.61127: we have included files to process 26764 1726882719.61128: generating all_blocks data 26764 1726882719.61129: done generating all_blocks data 26764 1726882719.61134: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 26764 1726882719.61135: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 26764 1726882719.61137: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 26764 1726882719.61294: in VariableManager get_vars() 26764 1726882719.61541: done with get_vars() 26764 1726882719.61879: done processing included file 26764 1726882719.61881: iterating over new_blocks loaded from include file 26764 1726882719.61883: in VariableManager get_vars() 26764 1726882719.61897: done with get_vars() 26764 1726882719.61898: filtering new block on tags 26764 1726882719.61916: done filtering new block on tags 26764 1726882719.61918: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 26764 1726882719.61923: extending task lists for all 
hosts with included blocks 26764 1726882719.62537: done extending task lists 26764 1726882719.62539: done processing included files 26764 1726882719.62540: results queue empty 26764 1726882719.62540: checking for any_errors_fatal 26764 1726882719.62543: done checking for any_errors_fatal 26764 1726882719.62544: checking for max_fail_percentage 26764 1726882719.62546: done checking for max_fail_percentage 26764 1726882719.62547: checking to see if all hosts have failed and the running result is not ok 26764 1726882719.62547: done checking to see if all hosts have failed 26764 1726882719.62548: getting the remaining hosts for this loop 26764 1726882719.62549: done getting the remaining hosts for this loop 26764 1726882719.62552: getting the next task for host managed_node2 26764 1726882719.62556: done getting next task for host managed_node2 26764 1726882719.62558: ^ task is: TASK: Include the task 'get_interface_stat.yml' 26764 1726882719.62560: ^ state is: HOST STATE: block=1, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882719.62563: getting variables 26764 1726882719.62566: in VariableManager get_vars() 26764 1726882719.62577: Calling all_inventory to load vars for managed_node2 26764 1726882719.62579: Calling groups_inventory to load vars for managed_node2 26764 1726882719.62581: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882719.62586: Calling all_plugins_play to load vars for managed_node2 26764 1726882719.62588: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882719.62591: Calling groups_plugins_play to load vars for managed_node2 26764 1726882719.63211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882719.63865: done with get_vars() 26764 1726882719.63874: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:38:39 -0400 (0:00:00.062) 0:00:05.582 ****** 26764 1726882719.64122: entering _queue_task() for managed_node2/include_tasks 26764 1726882719.64998: worker is 1 (out of 1 available) 26764 1726882719.65019: exiting _queue_task() for managed_node2/include_tasks 26764 1726882719.65031: done queuing things up, now waiting for results queue to drain 26764 1726882719.65032: waiting for pending results... 
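The file just included, assert_device_absent.yml, first includes get_interface_stat.yml (the task queued next, at assert_device_absent.yml:3) and then presumably asserts that the interface was not found. A hedged sketch, with the assert wording and the interface_stat variable name assumed rather than taken from this trace:

    # assert_device_absent.yml (reconstructed sketch)
    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml

    - name: Assert that the interface is absent    # task name assumed
      assert:
        that:
          - not interface_stat.stat.exists         # variable name assumed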
26764 1726882719.65823: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 26764 1726882719.66027: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000104 26764 1726882719.66039: variable 'ansible_search_path' from source: unknown 26764 1726882719.66043: variable 'ansible_search_path' from source: unknown 26764 1726882719.66092: calling self._execute() 26764 1726882719.66297: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.66302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.66431: variable 'omit' from source: magic vars 26764 1726882719.67145: variable 'ansible_distribution_major_version' from source: facts 26764 1726882719.67158: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882719.67169: _execute() done 26764 1726882719.67172: dumping result to json 26764 1726882719.67175: done dumping result, returning 26764 1726882719.67178: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-9875-c9a3-000000000104] 26764 1726882719.67184: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000104 26764 1726882719.67393: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000104 26764 1726882719.67396: WORKER PROCESS EXITING 26764 1726882719.67441: no more pending results, returning what we have 26764 1726882719.67446: in VariableManager get_vars() 26764 1726882719.67495: Calling all_inventory to load vars for managed_node2 26764 1726882719.67498: Calling groups_inventory to load vars for managed_node2 26764 1726882719.67500: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882719.67520: Calling all_plugins_play to load vars for managed_node2 26764 1726882719.67524: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882719.67527: Calling groups_plugins_play to load vars for managed_node2 26764 1726882719.67732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882719.67957: done with get_vars() 26764 1726882719.67967: variable 'ansible_search_path' from source: unknown 26764 1726882719.67968: variable 'ansible_search_path' from source: unknown 26764 1726882719.68013: we have included files to process 26764 1726882719.68014: generating all_blocks data 26764 1726882719.68017: done generating all_blocks data 26764 1726882719.68018: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26764 1726882719.68019: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26764 1726882719.68021: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26764 1726882719.68785: done processing included file 26764 1726882719.68787: iterating over new_blocks loaded from include file 26764 1726882719.68788: in VariableManager get_vars() 26764 1726882719.68802: done with get_vars() 26764 1726882719.68804: filtering new block on tags 26764 1726882719.68817: done filtering new block on tags 26764 1726882719.68818: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 26764 
1726882719.68822: extending task lists for all hosts with included blocks 26764 1726882719.68934: done extending task lists 26764 1726882719.68936: done processing included files 26764 1726882719.68936: results queue empty 26764 1726882719.68937: checking for any_errors_fatal 26764 1726882719.68940: done checking for any_errors_fatal 26764 1726882719.68941: checking for max_fail_percentage 26764 1726882719.68942: done checking for max_fail_percentage 26764 1726882719.68943: checking to see if all hosts have failed and the running result is not ok 26764 1726882719.68943: done checking to see if all hosts have failed 26764 1726882719.68944: getting the remaining hosts for this loop 26764 1726882719.68945: done getting the remaining hosts for this loop 26764 1726882719.68948: getting the next task for host managed_node2 26764 1726882719.68951: done getting next task for host managed_node2 26764 1726882719.68953: ^ task is: TASK: Get stat for interface {{ interface }} 26764 1726882719.68956: ^ state is: HOST STATE: block=1, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882719.68958: getting variables 26764 1726882719.68959: in VariableManager get_vars() 26764 1726882719.68973: Calling all_inventory to load vars for managed_node2 26764 1726882719.69201: Calling groups_inventory to load vars for managed_node2 26764 1726882719.69204: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882719.69209: Calling all_plugins_play to load vars for managed_node2 26764 1726882719.69212: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882719.69215: Calling groups_plugins_play to load vars for managed_node2 26764 1726882719.69502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882719.69920: done with get_vars() 26764 1726882719.69929: done getting variables 26764 1726882719.70326: variable 'interface' from source: play vars TASK [Get stat for interface rpltstbr] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:38:39 -0400 (0:00:00.062) 0:00:05.645 ****** 26764 1726882719.70370: entering _queue_task() for managed_node2/stat 26764 1726882719.70966: worker is 1 (out of 1 available) 26764 1726882719.70977: exiting _queue_task() for managed_node2/stat 26764 1726882719.70987: done queuing things up, now waiting for results queue to drain 26764 1726882719.70988: waiting for pending results... 
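[editor's note] The stat task queued here is the only task this log attributes to get_interface_stat.yml. Judging from the module_args shown in the module invocation below (path /sys/class/net/rpltstbr with get_attributes, get_checksum, and get_mime disabled) and from interface_stat later resolving as a host fact, the task is presumably close to the following; 'register: interface_stat' is an assumption consistent with, but not literally shown in, the log:

# Reconstructed sketch of tasks/get_interface_stat.yml, based on the module_args
# visible in this log; the register name is assumed.
- name: Get stat for interface {{ interface }}        # get_interface_stat.yml:3
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat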
26764 1726882719.71850: running TaskExecutor() for managed_node2/TASK: Get stat for interface rpltstbr 26764 1726882719.72069: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000011e 26764 1726882719.72080: variable 'ansible_search_path' from source: unknown 26764 1726882719.72084: variable 'ansible_search_path' from source: unknown 26764 1726882719.72116: calling self._execute() 26764 1726882719.72313: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.72319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.72330: variable 'omit' from source: magic vars 26764 1726882719.73173: variable 'ansible_distribution_major_version' from source: facts 26764 1726882719.73185: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882719.73192: variable 'omit' from source: magic vars 26764 1726882719.73368: variable 'omit' from source: magic vars 26764 1726882719.73583: variable 'interface' from source: play vars 26764 1726882719.73601: variable 'omit' from source: magic vars 26764 1726882719.73644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882719.73808: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882719.73826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882719.73843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882719.73855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882719.73982: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882719.73985: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.73993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.74227: Set connection var ansible_shell_executable to /bin/sh 26764 1726882719.74231: Set connection var ansible_shell_type to sh 26764 1726882719.74243: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882719.74248: Set connection var ansible_timeout to 10 26764 1726882719.74253: Set connection var ansible_connection to ssh 26764 1726882719.74258: Set connection var ansible_pipelining to False 26764 1726882719.74282: variable 'ansible_shell_executable' from source: unknown 26764 1726882719.74285: variable 'ansible_connection' from source: unknown 26764 1726882719.74288: variable 'ansible_module_compression' from source: unknown 26764 1726882719.74290: variable 'ansible_shell_type' from source: unknown 26764 1726882719.74293: variable 'ansible_shell_executable' from source: unknown 26764 1726882719.74295: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882719.74297: variable 'ansible_pipelining' from source: unknown 26764 1726882719.74299: variable 'ansible_timeout' from source: unknown 26764 1726882719.74304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882719.74743: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882719.74872: variable 'omit' 
from source: magic vars 26764 1726882719.74884: starting attempt loop 26764 1726882719.74887: running the handler 26764 1726882719.74899: _low_level_execute_command(): starting 26764 1726882719.74907: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882719.76884: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882719.76896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.76907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.76930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.76978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.77034: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882719.77053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.77070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882719.77080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882719.77143: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882719.77153: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.77173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.77186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.77194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.77202: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882719.77211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.77375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882719.77399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882719.77409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882719.77611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882719.79882: stdout chunk (state=3): >>>/root <<< 26764 1726882719.80084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882719.80090: stdout chunk (state=3): >>><<< 26764 1726882719.80099: stderr chunk (state=3): >>><<< 26764 1726882719.80129: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882719.80141: _low_level_execute_command(): starting 26764 1726882719.80149: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404 `" && echo ansible-tmp-1726882719.8012726-27034-110356199425404="` echo /root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404 `" ) && sleep 0' 26764 1726882719.81362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.81415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882719.81418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882719.81432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882719.81562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882719.84206: stdout chunk (state=3): >>>ansible-tmp-1726882719.8012726-27034-110356199425404=/root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404 <<< 26764 1726882719.84444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882719.84447: stderr chunk (state=3): >>><<< 26764 1726882719.84450: stdout chunk (state=3): >>><<< 26764 1726882719.84470: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882719.8012726-27034-110356199425404=/root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882719.84515: variable 'ansible_module_compression' from source: unknown 26764 1726882719.84576: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 26764 1726882719.84613: variable 'ansible_facts' from source: unknown 26764 1726882719.84701: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404/AnsiballZ_stat.py 26764 1726882719.84835: Sending initial data 26764 1726882719.84839: Sent initial data (153 bytes) 26764 1726882719.86733: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882719.86815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.86819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.86821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.86834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.86837: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882719.86842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.86854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882719.86861: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882719.86872: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882719.86879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.86888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.86898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.86905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.86914: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882719.86925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.87002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882719.87015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882719.87021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882719.87384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882719.89620: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882719.89719: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882719.89842: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpkrr5ns2u /root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404/AnsiballZ_stat.py <<< 26764 1726882719.89928: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882719.91492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882719.91595: stderr chunk (state=3): >>><<< 26764 1726882719.91600: stdout chunk (state=3): >>><<< 26764 1726882719.91614: done transferring module to remote 26764 1726882719.91627: _low_level_execute_command(): starting 26764 1726882719.91632: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404/ /root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404/AnsiballZ_stat.py && sleep 0' 26764 1726882719.94944: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882719.94947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.94949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.94951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.94953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.94959: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882719.94960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.94962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882719.94987: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882719.94990: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882719.94992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.94994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.94996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.94998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.95000: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882719.95002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.95004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882719.95452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882719.95455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882719.95457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882719.97784: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882719.97787: stdout chunk (state=3): >>><<< 26764 1726882719.97789: stderr chunk (state=3): >>><<< 26764 1726882719.97792: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882719.97794: _low_level_execute_command(): starting 26764 1726882719.97871: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404/AnsiballZ_stat.py && sleep 0' 26764 1726882719.99098: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882719.99112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.99125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.99141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.99193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.99205: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882719.99220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.99250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882719.99282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882719.99295: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882719.99307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882719.99319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882719.99334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882719.99345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882719.99354: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882719.99371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882719.99457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 
1726882719.99479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882719.99506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882719.99655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 26764 1726882720.16255: stdout chunk (state=3): >>> <<< 26764 1726882720.16259: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/rpltstbr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 26764 1726882720.18027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882720.18031: stderr chunk (state=3): >>>Shared connection to 10.31.11.158 closed. <<< 26764 1726882720.18034: stdout chunk (state=3): >>><<< 26764 1726882720.18036: stderr chunk (state=3): >>><<< 26764 1726882720.18038: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/rpltstbr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
26764 1726882720.18041: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/rpltstbr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882720.18047: _low_level_execute_command(): starting 26764 1726882720.18049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882719.8012726-27034-110356199425404/ > /dev/null 2>&1 && sleep 0' 26764 1726882720.18730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882720.18754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882720.18774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882720.18798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882720.18862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882720.18881: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882720.18897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882720.18915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882720.18929: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882720.18943: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882720.18960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882720.18987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882720.19003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882720.19015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882720.19026: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882720.19040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882720.19133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882720.19149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882720.19168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882720.19448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882720.21215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882720.21285: stderr chunk (state=3): >>><<< 26764 1726882720.21288: stdout chunk (state=3): >>><<< 26764 1726882720.21376: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882720.21379: handler run complete 26764 1726882720.21381: attempt loop complete, returning result 26764 1726882720.21383: _execute() done 26764 1726882720.21385: dumping result to json 26764 1726882720.21388: done dumping result, returning 26764 1726882720.21390: done running TaskExecutor() for managed_node2/TASK: Get stat for interface rpltstbr [0e448fcc-3ce9-9875-c9a3-00000000011e] 26764 1726882720.21392: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000011e 26764 1726882720.21647: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000011e 26764 1726882720.21650: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 26764 1726882720.21743: no more pending results, returning what we have 26764 1726882720.21746: results queue empty 26764 1726882720.21747: checking for any_errors_fatal 26764 1726882720.21749: done checking for any_errors_fatal 26764 1726882720.21750: checking for max_fail_percentage 26764 1726882720.21752: done checking for max_fail_percentage 26764 1726882720.21753: checking to see if all hosts have failed and the running result is not ok 26764 1726882720.21753: done checking to see if all hosts have failed 26764 1726882720.21755: getting the remaining hosts for this loop 26764 1726882720.21756: done getting the remaining hosts for this loop 26764 1726882720.21760: getting the next task for host managed_node2 26764 1726882720.21773: done getting next task for host managed_node2 26764 1726882720.21778: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 26764 1726882720.21781: ^ state is: HOST STATE: block=1, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882720.21786: getting variables 26764 1726882720.21789: in VariableManager get_vars() 26764 1726882720.21831: Calling all_inventory to load vars for managed_node2 26764 1726882720.21834: Calling groups_inventory to load vars for managed_node2 26764 1726882720.21837: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882720.21850: Calling all_plugins_play to load vars for managed_node2 26764 1726882720.21853: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882720.21856: Calling groups_plugins_play to load vars for managed_node2 26764 1726882720.22214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882720.22560: done with get_vars() 26764 1726882720.22577: done getting variables 26764 1726882720.22704: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 26764 1726882720.22858: variable 'interface' from source: play vars TASK [Assert that the interface is absent - 'rpltstbr'] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:38:40 -0400 (0:00:00.525) 0:00:06.170 ****** 26764 1726882720.22911: entering _queue_task() for managed_node2/assert 26764 1726882720.22913: Creating lock for assert 26764 1726882720.23436: worker is 1 (out of 1 available) 26764 1726882720.23454: exiting _queue_task() for managed_node2/assert 26764 1726882720.23514: done queuing things up, now waiting for results queue to drain 26764 1726882720.23515: waiting for pending results... 
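[editor's note] The assert task queued here only inspects the result registered by the preceding stat task. From the module output above, the data it evaluates amounts to:

# Fact seen by the assert action, taken from the stat result earlier in this log:
interface_stat:
  changed: false
  stat:
    exists: false
# 'not interface_stat.stat.exists' is therefore true, so the assertion passes.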
26764 1726882720.24457: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'rpltstbr' 26764 1726882720.24573: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000105 26764 1726882720.24587: variable 'ansible_search_path' from source: unknown 26764 1726882720.24591: variable 'ansible_search_path' from source: unknown 26764 1726882720.25755: calling self._execute() 26764 1726882720.25951: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.25958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.26044: variable 'omit' from source: magic vars 26764 1726882720.26932: variable 'ansible_distribution_major_version' from source: facts 26764 1726882720.26944: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882720.26952: variable 'omit' from source: magic vars 26764 1726882720.26991: variable 'omit' from source: magic vars 26764 1726882720.27243: variable 'interface' from source: play vars 26764 1726882720.27286: variable 'omit' from source: magic vars 26764 1726882720.27387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882720.27541: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882720.27561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882720.27655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882720.27670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882720.27741: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882720.27744: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.27747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.28195: Set connection var ansible_shell_executable to /bin/sh 26764 1726882720.28198: Set connection var ansible_shell_type to sh 26764 1726882720.28208: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882720.28221: Set connection var ansible_timeout to 10 26764 1726882720.28229: Set connection var ansible_connection to ssh 26764 1726882720.28232: Set connection var ansible_pipelining to False 26764 1726882720.28314: variable 'ansible_shell_executable' from source: unknown 26764 1726882720.28318: variable 'ansible_connection' from source: unknown 26764 1726882720.28321: variable 'ansible_module_compression' from source: unknown 26764 1726882720.28324: variable 'ansible_shell_type' from source: unknown 26764 1726882720.28327: variable 'ansible_shell_executable' from source: unknown 26764 1726882720.28329: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.28333: variable 'ansible_pipelining' from source: unknown 26764 1726882720.28335: variable 'ansible_timeout' from source: unknown 26764 1726882720.28337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.30187: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 26764 1726882720.30199: variable 'omit' from source: magic vars 26764 1726882720.30288: starting attempt loop 26764 1726882720.30291: running the handler 26764 1726882720.30607: variable 'interface_stat' from source: set_fact 26764 1726882720.30616: Evaluated conditional (not interface_stat.stat.exists): True 26764 1726882720.30620: handler run complete 26764 1726882720.30638: attempt loop complete, returning result 26764 1726882720.30760: _execute() done 26764 1726882720.30765: dumping result to json 26764 1726882720.30776: done dumping result, returning 26764 1726882720.30847: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'rpltstbr' [0e448fcc-3ce9-9875-c9a3-000000000105] 26764 1726882720.30850: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000105 26764 1726882720.30938: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000105 26764 1726882720.30941: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 26764 1726882720.31009: no more pending results, returning what we have 26764 1726882720.31013: results queue empty 26764 1726882720.31014: checking for any_errors_fatal 26764 1726882720.31022: done checking for any_errors_fatal 26764 1726882720.31023: checking for max_fail_percentage 26764 1726882720.31024: done checking for max_fail_percentage 26764 1726882720.31025: checking to see if all hosts have failed and the running result is not ok 26764 1726882720.31026: done checking to see if all hosts have failed 26764 1726882720.31027: getting the remaining hosts for this loop 26764 1726882720.31028: done getting the remaining hosts for this loop 26764 1726882720.31031: getting the next task for host managed_node2 26764 1726882720.31042: done getting next task for host managed_node2 26764 1726882720.31045: ^ task is: TASK: meta (flush_handlers) 26764 1726882720.31047: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882720.31052: getting variables 26764 1726882720.31054: in VariableManager get_vars() 26764 1726882720.31094: Calling all_inventory to load vars for managed_node2 26764 1726882720.31097: Calling groups_inventory to load vars for managed_node2 26764 1726882720.31100: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882720.31120: Calling all_plugins_play to load vars for managed_node2 26764 1726882720.31128: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882720.31133: Calling groups_plugins_play to load vars for managed_node2 26764 1726882720.31337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882720.31559: done with get_vars() 26764 1726882720.31574: done getting variables 26764 1726882720.31768: in VariableManager get_vars() 26764 1726882720.31781: Calling all_inventory to load vars for managed_node2 26764 1726882720.31783: Calling groups_inventory to load vars for managed_node2 26764 1726882720.31785: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882720.31790: Calling all_plugins_play to load vars for managed_node2 26764 1726882720.31792: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882720.31795: Calling groups_plugins_play to load vars for managed_node2 26764 1726882720.32116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882720.32628: done with get_vars() 26764 1726882720.32641: done queuing things up, now waiting for results queue to drain 26764 1726882720.32643: results queue empty 26764 1726882720.32644: checking for any_errors_fatal 26764 1726882720.32646: done checking for any_errors_fatal 26764 1726882720.32647: checking for max_fail_percentage 26764 1726882720.32648: done checking for max_fail_percentage 26764 1726882720.32649: checking to see if all hosts have failed and the running result is not ok 26764 1726882720.32650: done checking to see if all hosts have failed 26764 1726882720.32655: getting the remaining hosts for this loop 26764 1726882720.32656: done getting the remaining hosts for this loop 26764 1726882720.32658: getting the next task for host managed_node2 26764 1726882720.32662: done getting next task for host managed_node2 26764 1726882720.32668: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 26764 1726882720.32670: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882720.32681: getting variables 26764 1726882720.32682: in VariableManager get_vars() 26764 1726882720.32696: Calling all_inventory to load vars for managed_node2 26764 1726882720.32698: Calling groups_inventory to load vars for managed_node2 26764 1726882720.32700: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882720.32818: Calling all_plugins_play to load vars for managed_node2 26764 1726882720.32828: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882720.32832: Calling groups_plugins_play to load vars for managed_node2 26764 1726882720.33090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882720.33523: done with get_vars() 26764 1726882720.33531: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:40 -0400 (0:00:00.108) 0:00:06.278 ****** 26764 1726882720.33721: entering _queue_task() for managed_node2/include_tasks 26764 1726882720.34198: worker is 1 (out of 1 available) 26764 1726882720.34211: exiting _queue_task() for managed_node2/include_tasks 26764 1726882720.34222: done queuing things up, now waiting for results queue to drain 26764 1726882720.34223: waiting for pending results... 26764 1726882720.35004: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 26764 1726882720.35191: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000000f 26764 1726882720.35205: variable 'ansible_search_path' from source: unknown 26764 1726882720.35209: variable 'ansible_search_path' from source: unknown 26764 1726882720.35304: calling self._execute() 26764 1726882720.35505: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.35509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.35520: variable 'omit' from source: magic vars 26764 1726882720.36250: variable 'ansible_distribution_major_version' from source: facts 26764 1726882720.36262: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882720.36270: _execute() done 26764 1726882720.36392: dumping result to json 26764 1726882720.36395: done dumping result, returning 26764 1726882720.36403: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-9875-c9a3-00000000000f] 26764 1726882720.36414: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000000f 26764 1726882720.36505: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000000f 26764 1726882720.36508: WORKER PROCESS EXITING 26764 1726882720.36560: no more pending results, returning what we have 26764 1726882720.36572: in VariableManager get_vars() 26764 1726882720.36609: Calling all_inventory to load vars for managed_node2 26764 1726882720.36612: Calling groups_inventory to load vars for managed_node2 26764 1726882720.36614: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882720.36629: Calling all_plugins_play to load vars for managed_node2 26764 1726882720.36632: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882720.36636: Calling groups_plugins_play to load vars for managed_node2 26764 1726882720.36806: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882720.37017: done with get_vars() 26764 1726882720.37024: variable 'ansible_search_path' from source: unknown 26764 1726882720.37026: variable 'ansible_search_path' from source: unknown 26764 1726882720.37051: we have included files to process 26764 1726882720.37052: generating all_blocks data 26764 1726882720.37053: done generating all_blocks data 26764 1726882720.37058: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26764 1726882720.37059: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26764 1726882720.37062: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26764 1726882720.38643: done processing included file 26764 1726882720.38645: iterating over new_blocks loaded from include file 26764 1726882720.38646: in VariableManager get_vars() 26764 1726882720.38669: done with get_vars() 26764 1726882720.38671: filtering new block on tags 26764 1726882720.38688: done filtering new block on tags 26764 1726882720.38691: in VariableManager get_vars() 26764 1726882720.38711: done with get_vars() 26764 1726882720.38713: filtering new block on tags 26764 1726882720.38852: done filtering new block on tags 26764 1726882720.38855: in VariableManager get_vars() 26764 1726882720.38879: done with get_vars() 26764 1726882720.38881: filtering new block on tags 26764 1726882720.38898: done filtering new block on tags 26764 1726882720.38900: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 26764 1726882720.38904: extending task lists for all hosts with included blocks 26764 1726882720.39983: done extending task lists 26764 1726882720.39985: done processing included files 26764 1726882720.39986: results queue empty 26764 1726882720.39987: checking for any_errors_fatal 26764 1726882720.39988: done checking for any_errors_fatal 26764 1726882720.39989: checking for max_fail_percentage 26764 1726882720.39990: done checking for max_fail_percentage 26764 1726882720.39991: checking to see if all hosts have failed and the running result is not ok 26764 1726882720.39992: done checking to see if all hosts have failed 26764 1726882720.39993: getting the remaining hosts for this loop 26764 1726882720.39994: done getting the remaining hosts for this loop 26764 1726882720.39996: getting the next task for host managed_node2 26764 1726882720.40000: done getting next task for host managed_node2 26764 1726882720.40003: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 26764 1726882720.40005: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882720.40015: getting variables 26764 1726882720.40016: in VariableManager get_vars() 26764 1726882720.40029: Calling all_inventory to load vars for managed_node2 26764 1726882720.40146: Calling groups_inventory to load vars for managed_node2 26764 1726882720.40149: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882720.40161: Calling all_plugins_play to load vars for managed_node2 26764 1726882720.40167: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882720.40171: Calling groups_plugins_play to load vars for managed_node2 26764 1726882720.40449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882720.40878: done with get_vars() 26764 1726882720.40887: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:40 -0400 (0:00:00.074) 0:00:06.353 ****** 26764 1726882720.41146: entering _queue_task() for managed_node2/setup 26764 1726882720.41655: worker is 1 (out of 1 available) 26764 1726882720.41783: exiting _queue_task() for managed_node2/setup 26764 1726882720.41801: done queuing things up, now waiting for results queue to drain 26764 1726882720.41802: waiting for pending results... 26764 1726882720.42473: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 26764 1726882720.42678: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000137 26764 1726882720.42692: variable 'ansible_search_path' from source: unknown 26764 1726882720.42695: variable 'ansible_search_path' from source: unknown 26764 1726882720.42726: calling self._execute() 26764 1726882720.42910: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.42916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.42925: variable 'omit' from source: magic vars 26764 1726882720.43601: variable 'ansible_distribution_major_version' from source: facts 26764 1726882720.43614: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882720.44175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882720.48848: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882720.48917: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882720.49070: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882720.49098: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882720.49124: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882720.49305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882720.49334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 26764 1726882720.49358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882720.49513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882720.49527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882720.49575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882720.49699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882720.49728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882720.49770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882720.49782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882720.50319: variable '__network_required_facts' from source: role '' defaults 26764 1726882720.50325: variable 'ansible_facts' from source: unknown 26764 1726882720.50483: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 26764 1726882720.50488: when evaluation is False, skipping this task 26764 1726882720.50491: _execute() done 26764 1726882720.50493: dumping result to json 26764 1726882720.50495: done dumping result, returning 26764 1726882720.50498: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-9875-c9a3-000000000137] 26764 1726882720.50774: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000137 26764 1726882720.50839: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000137 26764 1726882720.50842: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 1726882720.50903: no more pending results, returning what we have 26764 1726882720.50906: results queue empty 26764 1726882720.50907: checking for any_errors_fatal 26764 1726882720.50908: done checking for any_errors_fatal 26764 1726882720.50909: checking for max_fail_percentage 26764 1726882720.50910: done checking for max_fail_percentage 26764 1726882720.50911: checking to see if all hosts have failed and the running result is not ok 26764 1726882720.50912: done checking to see if all hosts have failed 26764 1726882720.50913: getting the remaining hosts for this loop 26764 1726882720.50914: done getting the remaining hosts for 
this loop 26764 1726882720.50917: getting the next task for host managed_node2 26764 1726882720.50924: done getting next task for host managed_node2 26764 1726882720.50928: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 26764 1726882720.50931: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882720.50943: getting variables 26764 1726882720.50945: in VariableManager get_vars() 26764 1726882720.50984: Calling all_inventory to load vars for managed_node2 26764 1726882720.50987: Calling groups_inventory to load vars for managed_node2 26764 1726882720.50990: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882720.50999: Calling all_plugins_play to load vars for managed_node2 26764 1726882720.51002: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882720.51005: Calling groups_plugins_play to load vars for managed_node2 26764 1726882720.51631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882720.51855: done with get_vars() 26764 1726882720.51867: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:40 -0400 (0:00:00.108) 0:00:06.461 ****** 26764 1726882720.51973: entering _queue_task() for managed_node2/stat 26764 1726882720.52496: worker is 1 (out of 1 available) 26764 1726882720.52513: exiting _queue_task() for managed_node2/stat 26764 1726882720.52526: done queuing things up, now waiting for results queue to drain 26764 1726882720.52527: waiting for pending results... 
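The "Ensure ansible_facts used by role are present" task skipped just above (task path set_facts.yml:3) is gated on the logged conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): it only re-runs fact gathering when a fact the role needs is missing, and its result is censored because the task sets no_log. A minimal sketch of what that task can look like, reconstructed only from the logged task name, the setup action, the conditional and the no_log behaviour; the gather_subset value is an assumption and the real role file may differ:

    # Reconstruction of set_facts.yml:3 from the log above; gather_subset is an assumption.
    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true

In this run the difference is empty (all required facts are already present), so the conditional evaluates to False and the task is skipped.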
26764 1726882720.53574: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 26764 1726882720.53712: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000139 26764 1726882720.53730: variable 'ansible_search_path' from source: unknown 26764 1726882720.53739: variable 'ansible_search_path' from source: unknown 26764 1726882720.53781: calling self._execute() 26764 1726882720.53868: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.53881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.53895: variable 'omit' from source: magic vars 26764 1726882720.54386: variable 'ansible_distribution_major_version' from source: facts 26764 1726882720.54404: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882720.54792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882720.55518: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882720.55596: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882720.55657: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882720.55705: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882720.55806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882720.55835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882720.55878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882720.55920: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882720.56018: variable '__network_is_ostree' from source: set_fact 26764 1726882720.56028: Evaluated conditional (not __network_is_ostree is defined): False 26764 1726882720.56035: when evaluation is False, skipping this task 26764 1726882720.56042: _execute() done 26764 1726882720.56048: dumping result to json 26764 1726882720.56058: done dumping result, returning 26764 1726882720.56074: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-9875-c9a3-000000000139] 26764 1726882720.56110: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000139 26764 1726882720.56219: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000139 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 26764 1726882720.56270: no more pending results, returning what we have 26764 1726882720.56274: results queue empty 26764 1726882720.56275: checking for any_errors_fatal 26764 1726882720.56280: done checking for any_errors_fatal 26764 1726882720.56280: checking for max_fail_percentage 26764 1726882720.56282: done 
checking for max_fail_percentage 26764 1726882720.56283: checking to see if all hosts have failed and the running result is not ok 26764 1726882720.56284: done checking to see if all hosts have failed 26764 1726882720.56285: getting the remaining hosts for this loop 26764 1726882720.56286: done getting the remaining hosts for this loop 26764 1726882720.56289: getting the next task for host managed_node2 26764 1726882720.56295: done getting next task for host managed_node2 26764 1726882720.56299: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 26764 1726882720.56301: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882720.56316: getting variables 26764 1726882720.56318: in VariableManager get_vars() 26764 1726882720.56354: Calling all_inventory to load vars for managed_node2 26764 1726882720.56356: Calling groups_inventory to load vars for managed_node2 26764 1726882720.56358: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882720.56371: Calling all_plugins_play to load vars for managed_node2 26764 1726882720.56374: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882720.56377: Calling groups_plugins_play to load vars for managed_node2 26764 1726882720.56611: WORKER PROCESS EXITING 26764 1726882720.56625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882720.56831: done with get_vars() 26764 1726882720.56840: done getting variables 26764 1726882720.57010: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:40 -0400 (0:00:00.050) 0:00:06.512 ****** 26764 1726882720.57046: entering _queue_task() for managed_node2/set_fact 26764 1726882720.57636: worker is 1 (out of 1 available) 26764 1726882720.57649: exiting _queue_task() for managed_node2/set_fact 26764 1726882720.57733: done queuing things up, now waiting for results queue to drain 26764 1726882720.57734: waiting for pending results... 
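The "Check if system is ostree" task above and the "Set flag to indicate system is ostree" task queued next share the guard "not __network_is_ostree is defined"; because __network_is_ostree already comes "from source: set_fact", the guard evaluates to False for both and both are skipped. A sketch of the pair of tasks at set_facts.yml:12 and set_facts.yml:17, reconstructed from the logged task names, actions (stat and set_fact) and the shared guard; the stat path and the register/fact wiring are assumptions, not something shown in this log:

    # Reconstruction from the log; the path and variable wiring below are assumptions.
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted          # assumed marker file
      register: __ostree_booted_stat      # assumed register name (not shown in the log)
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed derivation
      when: not __network_is_ostree is defined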
26764 1726882720.57873: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 26764 1726882720.58002: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000013a 26764 1726882720.58022: variable 'ansible_search_path' from source: unknown 26764 1726882720.58030: variable 'ansible_search_path' from source: unknown 26764 1726882720.58078: calling self._execute() 26764 1726882720.58152: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.58166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.58190: variable 'omit' from source: magic vars 26764 1726882720.58547: variable 'ansible_distribution_major_version' from source: facts 26764 1726882720.58567: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882720.58738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882720.59011: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882720.59068: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882720.59109: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882720.59144: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882720.59236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882720.59276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882720.59305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882720.59334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882720.59429: variable '__network_is_ostree' from source: set_fact 26764 1726882720.59445: Evaluated conditional (not __network_is_ostree is defined): False 26764 1726882720.59454: when evaluation is False, skipping this task 26764 1726882720.59462: _execute() done 26764 1726882720.59485: dumping result to json 26764 1726882720.59497: done dumping result, returning 26764 1726882720.59508: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-9875-c9a3-00000000013a] 26764 1726882720.59516: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000013a skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 26764 1726882720.59696: no more pending results, returning what we have 26764 1726882720.59699: results queue empty 26764 1726882720.59700: checking for any_errors_fatal 26764 1726882720.59706: done checking for any_errors_fatal 26764 1726882720.59708: checking for max_fail_percentage 26764 1726882720.59709: done checking for max_fail_percentage 26764 1726882720.59710: checking to see 
if all hosts have failed and the running result is not ok 26764 1726882720.59711: done checking to see if all hosts have failed 26764 1726882720.59712: getting the remaining hosts for this loop 26764 1726882720.59713: done getting the remaining hosts for this loop 26764 1726882720.59716: getting the next task for host managed_node2 26764 1726882720.59757: done getting next task for host managed_node2 26764 1726882720.59762: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 26764 1726882720.59766: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882720.59782: getting variables 26764 1726882720.59784: in VariableManager get_vars() 26764 1726882720.60448: Calling all_inventory to load vars for managed_node2 26764 1726882720.60451: Calling groups_inventory to load vars for managed_node2 26764 1726882720.60453: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882720.60462: Calling all_plugins_play to load vars for managed_node2 26764 1726882720.60890: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882720.60895: Calling groups_plugins_play to load vars for managed_node2 26764 1726882720.61700: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000013a 26764 1726882720.61704: WORKER PROCESS EXITING 26764 1726882720.62005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882720.62801: done with get_vars() 26764 1726882720.62811: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:40 -0400 (0:00:00.060) 0:00:06.572 ****** 26764 1726882720.63092: entering _queue_task() for managed_node2/service_facts 26764 1726882720.63206: Creating lock for service_facts 26764 1726882720.63908: worker is 1 (out of 1 available) 26764 1726882720.63920: exiting _queue_task() for managed_node2/service_facts 26764 1726882720.63931: done queuing things up, now waiting for results queue to drain 26764 1726882720.63932: waiting for pending results... 
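"Check which services are running" (set_facts.yml:21) is the first task in this block that actually executes: it runs the service_facts module (note "Creating lock for service_facts" and the ANSIBALLZ module build and SSH transfer that follow), which populates ansible_facts.services with one entry per systemd unit, as seen in the JSON payload returned further down (for example, NetworkManager.service is reported as running/enabled). A minimal sketch of the task, taken from the logged name and action only:

    # Reconstruction of set_facts.yml:21 from the log above.
    - name: Check which services are running
      service_facts:

A typical way to consume that result, shown purely as an illustration and not taken from this log, is a guard on a later task such as:

    when: "'NetworkManager.service' in ansible_facts.services"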
26764 1726882720.65223: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 26764 1726882720.65704: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000013c 26764 1726882720.65724: variable 'ansible_search_path' from source: unknown 26764 1726882720.65728: variable 'ansible_search_path' from source: unknown 26764 1726882720.65768: calling self._execute() 26764 1726882720.65875: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.65882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.65891: variable 'omit' from source: magic vars 26764 1726882720.66248: variable 'ansible_distribution_major_version' from source: facts 26764 1726882720.66306: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882720.66309: variable 'omit' from source: magic vars 26764 1726882720.66312: variable 'omit' from source: magic vars 26764 1726882720.66344: variable 'omit' from source: magic vars 26764 1726882720.66389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882720.66443: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882720.66447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882720.66450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882720.66468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882720.66498: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882720.66506: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.66513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.66623: Set connection var ansible_shell_executable to /bin/sh 26764 1726882720.66631: Set connection var ansible_shell_type to sh 26764 1726882720.66647: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882720.66656: Set connection var ansible_timeout to 10 26764 1726882720.66674: Set connection var ansible_connection to ssh 26764 1726882720.66690: Set connection var ansible_pipelining to False 26764 1726882720.66718: variable 'ansible_shell_executable' from source: unknown 26764 1726882720.66725: variable 'ansible_connection' from source: unknown 26764 1726882720.66733: variable 'ansible_module_compression' from source: unknown 26764 1726882720.66739: variable 'ansible_shell_type' from source: unknown 26764 1726882720.66745: variable 'ansible_shell_executable' from source: unknown 26764 1726882720.66752: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882720.66759: variable 'ansible_pipelining' from source: unknown 26764 1726882720.66770: variable 'ansible_timeout' from source: unknown 26764 1726882720.66783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882720.67021: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882720.67052: variable 'omit' from source: magic vars 26764 
1726882720.67055: starting attempt loop 26764 1726882720.67058: running the handler 26764 1726882720.67077: _low_level_execute_command(): starting 26764 1726882720.67085: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882720.68951: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882720.69030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882720.69090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882720.69109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882720.69359: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882720.70896: stdout chunk (state=3): >>>/root <<< 26764 1726882720.71052: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882720.71055: stderr chunk (state=3): >>><<< 26764 1726882720.71058: stdout chunk (state=3): >>><<< 26764 1726882720.71081: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882720.71094: _low_level_execute_command(): starting 26764 1726882720.71100: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965 `" && echo ansible-tmp-1726882720.7108014-27092-56048030599965="` echo /root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965 `" ) && sleep 0' 26764 
1726882720.71781: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882720.71784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882720.71787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882720.71790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882720.71874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882720.71877: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882720.71880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882720.71889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882720.71892: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882720.71895: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882720.71897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882720.71899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882720.71901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882720.71903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882720.71905: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882720.71912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882720.71989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882720.72004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882720.72015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882720.72140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882720.74044: stdout chunk (state=3): >>>ansible-tmp-1726882720.7108014-27092-56048030599965=/root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965 <<< 26764 1726882720.74213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882720.74216: stdout chunk (state=3): >>><<< 26764 1726882720.74223: stderr chunk (state=3): >>><<< 26764 1726882720.74238: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882720.7108014-27092-56048030599965=/root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882720.74283: variable 'ansible_module_compression' from source: unknown 26764 1726882720.74323: ANSIBALLZ: Using lock for service_facts 26764 1726882720.74326: ANSIBALLZ: Acquiring lock 26764 1726882720.74328: ANSIBALLZ: Lock acquired: 140693691933008 26764 1726882720.74330: ANSIBALLZ: Creating module 26764 1726882720.86892: ANSIBALLZ: Writing module into payload 26764 1726882720.86972: ANSIBALLZ: Writing module 26764 1726882720.86990: ANSIBALLZ: Renaming module 26764 1726882720.87001: ANSIBALLZ: Done creating module 26764 1726882720.87016: variable 'ansible_facts' from source: unknown 26764 1726882720.87070: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965/AnsiballZ_service_facts.py 26764 1726882720.87169: Sending initial data 26764 1726882720.87173: Sent initial data (161 bytes) 26764 1726882720.87824: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882720.87827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882720.87858: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882720.87861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882720.87872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882720.87925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882720.87928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882720.87930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882720.88038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882720.89880: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 26764 1726882720.89885: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882720.89984: stderr chunk (state=3): >>>debug1: Using 
server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882720.90088: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmph9tcfhxb /root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965/AnsiballZ_service_facts.py <<< 26764 1726882720.90186: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882720.91391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882720.91514: stderr chunk (state=3): >>><<< 26764 1726882720.91517: stdout chunk (state=3): >>><<< 26764 1726882720.91534: done transferring module to remote 26764 1726882720.91546: _low_level_execute_command(): starting 26764 1726882720.91551: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965/ /root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965/AnsiballZ_service_facts.py && sleep 0' 26764 1726882720.92656: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 26764 1726882720.92665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882720.92962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882720.94663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882720.94682: stderr chunk (state=3): >>><<< 26764 1726882720.94686: stdout chunk (state=3): >>><<< 26764 1726882720.94691: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882720.94694: _low_level_execute_command(): starting 26764 1726882720.94696: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965/AnsiballZ_service_facts.py && sleep 0' 26764 1726882720.95817: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882720.95826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882720.95837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882720.95851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882720.95891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882720.95899: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882720.95909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882720.95923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882720.95929: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882720.95936: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882720.95943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882720.95952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882720.95965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882720.95975: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882720.95982: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882720.95991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882720.96060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882720.96078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882720.96089: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882720.96214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882722.30715: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": 
{"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": 
"user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 26764 1726882722.31985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882722.32053: stderr chunk (state=3): >>><<< 26764 1726882722.32057: stdout chunk (state=3): >>><<< 26764 1726882722.32083: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
26764 1726882722.34735: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882722.34743: _low_level_execute_command(): starting 26764 1726882722.34748: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882720.7108014-27092-56048030599965/ > /dev/null 2>&1 && sleep 0' 26764 1726882722.35428: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882722.35437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882722.35447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882722.35462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882722.35511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882722.35517: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882722.35527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882722.35539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882722.35546: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882722.35553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882722.35560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882722.35574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882722.35633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882722.35640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882722.35647: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882722.35656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882722.35780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882722.35849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882722.35858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882722.36086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882722.37815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882722.37862: stderr chunk (state=3): >>><<< 26764 1726882722.37870: stdout chunk (state=3): >>><<< 26764 1726882722.37881: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882722.37887: handler run complete 26764 1726882722.38001: variable 'ansible_facts' from source: unknown 26764 1726882722.38083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882722.38329: variable 'ansible_facts' from source: unknown 26764 1726882722.38403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882722.38509: attempt loop complete, returning result 26764 1726882722.38512: _execute() done 26764 1726882722.38515: dumping result to json 26764 1726882722.38571: done dumping result, returning 26764 1726882722.38574: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-9875-c9a3-00000000013c] 26764 1726882722.38577: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000013c ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 1726882722.39301: no more pending results, returning what we have 26764 1726882722.39304: results queue empty 26764 1726882722.39305: checking for any_errors_fatal 26764 1726882722.39309: done checking for any_errors_fatal 26764 1726882722.39310: checking for max_fail_percentage 26764 1726882722.39311: done checking for max_fail_percentage 26764 1726882722.39312: checking to see if all hosts have failed and the running result is not ok 26764 1726882722.39313: done checking to see if all hosts have failed 26764 1726882722.39314: getting the remaining hosts for this loop 26764 1726882722.39315: done getting the remaining hosts for this loop 26764 1726882722.39318: getting the next task for host managed_node2 26764 1726882722.39325: done getting next task for host managed_node2 26764 1726882722.39328: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 26764 1726882722.39330: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882722.39339: getting variables 26764 1726882722.39340: in VariableManager get_vars() 26764 1726882722.39752: Calling all_inventory to load vars for managed_node2 26764 1726882722.39755: Calling groups_inventory to load vars for managed_node2 26764 1726882722.39758: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882722.39782: Calling all_plugins_play to load vars for managed_node2 26764 1726882722.39785: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882722.39789: Calling groups_plugins_play to load vars for managed_node2 26764 1726882722.40320: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000013c 26764 1726882722.40329: WORKER PROCESS EXITING 26764 1726882722.40351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882722.40859: done with get_vars() 26764 1726882722.40873: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:42 -0400 (0:00:01.778) 0:00:08.351 ****** 26764 1726882722.40940: entering _queue_task() for managed_node2/package_facts 26764 1726882722.40941: Creating lock for package_facts 26764 1726882722.41137: worker is 1 (out of 1 available) 26764 1726882722.41151: exiting _queue_task() for managed_node2/package_facts 26764 1726882722.41163: done queuing things up, now waiting for results queue to drain 26764 1726882722.41165: waiting for pending results... 26764 1726882722.41325: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 26764 1726882722.41399: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000013d 26764 1726882722.41409: variable 'ansible_search_path' from source: unknown 26764 1726882722.41414: variable 'ansible_search_path' from source: unknown 26764 1726882722.41441: calling self._execute() 26764 1726882722.41503: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882722.41507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882722.41516: variable 'omit' from source: magic vars 26764 1726882722.41780: variable 'ansible_distribution_major_version' from source: facts 26764 1726882722.41790: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882722.41796: variable 'omit' from source: magic vars 26764 1726882722.41831: variable 'omit' from source: magic vars 26764 1726882722.41856: variable 'omit' from source: magic vars 26764 1726882722.41891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882722.41916: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882722.41930: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882722.41944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882722.41960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882722.41986: variable 'inventory_hostname' from source: host 
vars for 'managed_node2' 26764 1726882722.41989: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882722.41992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882722.42058: Set connection var ansible_shell_executable to /bin/sh 26764 1726882722.42061: Set connection var ansible_shell_type to sh 26764 1726882722.42072: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882722.42084: Set connection var ansible_timeout to 10 26764 1726882722.42087: Set connection var ansible_connection to ssh 26764 1726882722.42089: Set connection var ansible_pipelining to False 26764 1726882722.42103: variable 'ansible_shell_executable' from source: unknown 26764 1726882722.42106: variable 'ansible_connection' from source: unknown 26764 1726882722.42109: variable 'ansible_module_compression' from source: unknown 26764 1726882722.42112: variable 'ansible_shell_type' from source: unknown 26764 1726882722.42114: variable 'ansible_shell_executable' from source: unknown 26764 1726882722.42116: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882722.42118: variable 'ansible_pipelining' from source: unknown 26764 1726882722.42121: variable 'ansible_timeout' from source: unknown 26764 1726882722.42126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882722.42261: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882722.42274: variable 'omit' from source: magic vars 26764 1726882722.42279: starting attempt loop 26764 1726882722.42282: running the handler 26764 1726882722.42301: _low_level_execute_command(): starting 26764 1726882722.42304: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882722.42797: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882722.42826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882722.42936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882722.42939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882722.43060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882722.44684: stdout chunk (state=3): >>>/root <<< 26764 1726882722.44851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882722.44862: stdout chunk 
(state=3): >>><<< 26764 1726882722.44882: stderr chunk (state=3): >>><<< 26764 1726882722.44916: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882722.44939: _low_level_execute_command(): starting 26764 1726882722.44949: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798 `" && echo ansible-tmp-1726882722.4492402-27192-86974012617798="` echo /root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798 `" ) && sleep 0' 26764 1726882722.45636: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882722.45650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882722.45683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882722.45702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882722.45744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882722.45758: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882722.45779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882722.45814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882722.45827: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882722.45840: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882722.45853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882722.45873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882722.45893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882722.45915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882722.45926: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882722.45938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882722.46027: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882722.46047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882722.46061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882722.46195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882722.48058: stdout chunk (state=3): >>>ansible-tmp-1726882722.4492402-27192-86974012617798=/root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798 <<< 26764 1726882722.48172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882722.48253: stderr chunk (state=3): >>><<< 26764 1726882722.48269: stdout chunk (state=3): >>><<< 26764 1726882722.48478: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882722.4492402-27192-86974012617798=/root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882722.48482: variable 'ansible_module_compression' from source: unknown 26764 1726882722.48485: ANSIBALLZ: Using lock for package_facts 26764 1726882722.48487: ANSIBALLZ: Acquiring lock 26764 1726882722.48489: ANSIBALLZ: Lock acquired: 140693691927392 26764 1726882722.48491: ANSIBALLZ: Creating module 26764 1726882722.83045: ANSIBALLZ: Writing module into payload 26764 1726882722.83161: ANSIBALLZ: Writing module 26764 1726882722.83192: ANSIBALLZ: Renaming module 26764 1726882722.83195: ANSIBALLZ: Done creating module 26764 1726882722.83222: variable 'ansible_facts' from source: unknown 26764 1726882722.83356: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798/AnsiballZ_package_facts.py 26764 1726882722.83475: Sending initial data 26764 1726882722.83479: Sent initial data (161 bytes) 26764 1726882722.84155: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882722.84174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882722.84195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882722.84208: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882722.84218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882722.84268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882722.84284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882722.84291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882722.84409: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882722.86242: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882722.86335: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882722.86435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmp6mglqz9y /root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798/AnsiballZ_package_facts.py <<< 26764 1726882722.86550: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882722.89027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882722.89162: stderr chunk (state=3): >>><<< 26764 1726882722.89221: stdout chunk (state=3): >>><<< 26764 1726882722.89254: done transferring module to remote 26764 1726882722.89275: _low_level_execute_command(): starting 26764 1726882722.89304: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798/ /root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798/AnsiballZ_package_facts.py && sleep 0' 26764 1726882722.89903: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882722.89943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882722.89956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882722.89983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882722.90015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882722.90026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882722.90156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882722.91920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882722.91971: stderr chunk (state=3): >>><<< 26764 1726882722.91974: stdout chunk (state=3): >>><<< 26764 1726882722.91987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882722.91990: _low_level_execute_command(): starting 26764 1726882722.91994: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798/AnsiballZ_package_facts.py && sleep 0' 26764 1726882722.92428: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882722.92433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882722.92467: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882722.92480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882722.92534: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882722.92542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882722.92662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882723.38747: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": 
[{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", 
"version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}<<< 26764 1726882723.38825: stdout chunk (state=3): >>>], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": 
"1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "versi<<< 26764 1726882723.38834: stdout chunk (state=3): >>>on": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", 
"version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "versi<<< 26764 1726882723.38855: stdout chunk (state=3): >>>on": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": 
"rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"<<< 26764 1726882723.38873: stdout chunk (state=3): >>>}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": 
null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "per<<< 26764 1726882723.38893: stdout chunk (state=3): >>>l-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", 
"version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "r<<< 26764 1726882723.38906: stdout chunk (state=3): >>>elease": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch"<<< 26764 1726882723.38916: stdout chunk (state=3): >>>: 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": 
[{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": <<< 26764 1726882723.38927: stdout chunk (state=3): >>>"python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 26764 1726882723.40381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882723.40455: stderr chunk (state=3): >>><<< 26764 1726882723.40460: stdout chunk (state=3): >>><<< 26764 1726882723.40580: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": 
"4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": 
[{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": 
"508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", 
"version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": 
"perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": 
"1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": 
[{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
26764 1726882723.44186: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882723.44217: _low_level_execute_command(): starting 26764 1726882723.44225: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882722.4492402-27192-86974012617798/ > /dev/null 2>&1 && sleep 0' 26764 1726882723.45192: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882723.45224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882723.45251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882723.45288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882723.45352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882723.45367: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882723.45384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882723.45403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882723.45414: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882723.45423: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882723.45434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882723.45445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882723.45458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882723.45470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882723.45485: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882723.45497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882723.45574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882723.45599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882723.45615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882723.45740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882723.47701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882723.47704: stdout chunk (state=3): >>><<< 26764 1726882723.47707: stderr chunk (state=3): >>><<< 26764 1726882723.47771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882723.47775: handler run complete 26764 1726882723.48783: variable 'ansible_facts' from source: unknown 26764 1726882723.49312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882723.51639: variable 'ansible_facts' from source: unknown 26764 1726882723.52141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882723.52971: attempt loop complete, returning result 26764 1726882723.52996: _execute() done 26764 1726882723.53004: dumping result to json 26764 1726882723.53233: done dumping result, returning 26764 1726882723.53248: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-9875-c9a3-00000000013d] 26764 1726882723.53258: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000013d 26764 1726882723.55461: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000013d 26764 1726882723.55466: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 1726882723.55566: no more pending results, returning what we have 26764 1726882723.55570: results queue empty 26764 1726882723.55571: checking for any_errors_fatal 26764 1726882723.55575: done checking for any_errors_fatal 26764 1726882723.55576: checking for max_fail_percentage 26764 1726882723.55578: done checking for max_fail_percentage 26764 1726882723.55579: checking to see if all hosts have failed and the running result is not ok 26764 1726882723.55579: done checking to see if all hosts have failed 26764 1726882723.55580: getting the remaining hosts for this loop 26764 1726882723.55581: done getting the remaining hosts for this loop 26764 1726882723.55585: getting the next task for host managed_node2 26764 1726882723.55593: done getting next task for host managed_node2 26764 1726882723.55596: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 26764 1726882723.55598: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882723.55606: getting variables 26764 1726882723.55608: in VariableManager get_vars() 26764 1726882723.55642: Calling all_inventory to load vars for managed_node2 26764 1726882723.55644: Calling groups_inventory to load vars for managed_node2 26764 1726882723.55646: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882723.55657: Calling all_plugins_play to load vars for managed_node2 26764 1726882723.55659: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882723.55661: Calling groups_plugins_play to load vars for managed_node2 26764 1726882723.57326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882723.59089: done with get_vars() 26764 1726882723.59113: done getting variables 26764 1726882723.59188: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:43 -0400 (0:00:01.182) 0:00:09.533 ****** 26764 1726882723.59224: entering _queue_task() for managed_node2/debug 26764 1726882723.59536: worker is 1 (out of 1 available) 26764 1726882723.59548: exiting _queue_task() for managed_node2/debug 26764 1726882723.59565: done queuing things up, now waiting for results queue to drain 26764 1726882723.59566: waiting for pending results... 26764 1726882723.59853: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 26764 1726882723.59972: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000010 26764 1726882723.59993: variable 'ansible_search_path' from source: unknown 26764 1726882723.60006: variable 'ansible_search_path' from source: unknown 26764 1726882723.60051: calling self._execute() 26764 1726882723.60146: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882723.60159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882723.60177: variable 'omit' from source: magic vars 26764 1726882723.60571: variable 'ansible_distribution_major_version' from source: facts 26764 1726882723.60590: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882723.60602: variable 'omit' from source: magic vars 26764 1726882723.60640: variable 'omit' from source: magic vars 26764 1726882723.60750: variable 'network_provider' from source: set_fact 26764 1726882723.60783: variable 'omit' from source: magic vars 26764 1726882723.60828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882723.60874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882723.60903: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882723.60926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882723.60943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 
1726882723.60982: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882723.60997: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882723.61005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882723.61119: Set connection var ansible_shell_executable to /bin/sh 26764 1726882723.61127: Set connection var ansible_shell_type to sh 26764 1726882723.61143: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882723.61154: Set connection var ansible_timeout to 10 26764 1726882723.61169: Set connection var ansible_connection to ssh 26764 1726882723.61181: Set connection var ansible_pipelining to False 26764 1726882723.61219: variable 'ansible_shell_executable' from source: unknown 26764 1726882723.61228: variable 'ansible_connection' from source: unknown 26764 1726882723.61235: variable 'ansible_module_compression' from source: unknown 26764 1726882723.61242: variable 'ansible_shell_type' from source: unknown 26764 1726882723.61249: variable 'ansible_shell_executable' from source: unknown 26764 1726882723.61256: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882723.61266: variable 'ansible_pipelining' from source: unknown 26764 1726882723.61273: variable 'ansible_timeout' from source: unknown 26764 1726882723.61280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882723.61417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882723.61436: variable 'omit' from source: magic vars 26764 1726882723.61445: starting attempt loop 26764 1726882723.61450: running the handler 26764 1726882723.61495: handler run complete 26764 1726882723.61513: attempt loop complete, returning result 26764 1726882723.61523: _execute() done 26764 1726882723.61534: dumping result to json 26764 1726882723.61540: done dumping result, returning 26764 1726882723.61550: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-9875-c9a3-000000000010] 26764 1726882723.61558: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000010 ok: [managed_node2] => {} MSG: Using network provider: nm 26764 1726882723.61709: no more pending results, returning what we have 26764 1726882723.61712: results queue empty 26764 1726882723.61713: checking for any_errors_fatal 26764 1726882723.61723: done checking for any_errors_fatal 26764 1726882723.61724: checking for max_fail_percentage 26764 1726882723.61725: done checking for max_fail_percentage 26764 1726882723.61726: checking to see if all hosts have failed and the running result is not ok 26764 1726882723.61727: done checking to see if all hosts have failed 26764 1726882723.61729: getting the remaining hosts for this loop 26764 1726882723.61730: done getting the remaining hosts for this loop 26764 1726882723.61733: getting the next task for host managed_node2 26764 1726882723.61740: done getting next task for host managed_node2 26764 1726882723.61744: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26764 1726882723.61746: ^ state is: HOST STATE: block=3, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882723.61757: getting variables 26764 1726882723.61759: in VariableManager get_vars() 26764 1726882723.61796: Calling all_inventory to load vars for managed_node2 26764 1726882723.61799: Calling groups_inventory to load vars for managed_node2 26764 1726882723.61802: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882723.61812: Calling all_plugins_play to load vars for managed_node2 26764 1726882723.61815: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882723.61818: Calling groups_plugins_play to load vars for managed_node2 26764 1726882723.62822: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000010 26764 1726882723.62825: WORKER PROCESS EXITING 26764 1726882723.63550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882723.65304: done with get_vars() 26764 1726882723.65340: done getting variables 26764 1726882723.65448: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:43 -0400 (0:00:00.062) 0:00:09.596 ****** 26764 1726882723.65486: entering _queue_task() for managed_node2/fail 26764 1726882723.65488: Creating lock for fail 26764 1726882723.65817: worker is 1 (out of 1 available) 26764 1726882723.65830: exiting _queue_task() for managed_node2/fail 26764 1726882723.65843: done queuing things up, now waiting for results queue to drain 26764 1726882723.65844: waiting for pending results... 
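The "Print network provider" entries just above correspond to a plain debug action at roles/network/tasks/main.yml:7, which reported "Using network provider: nm" from the network_provider variable set earlier via set_fact. The role's exact wording is not shown in this trace; a minimal sketch of that kind of task, under that assumption, would be:

    # Sketch only; the real task sits at .../roles/network/tasks/main.yml:7
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"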
26764 1726882723.66141: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26764 1726882723.66268: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000011 26764 1726882723.66293: variable 'ansible_search_path' from source: unknown 26764 1726882723.66301: variable 'ansible_search_path' from source: unknown 26764 1726882723.66351: calling self._execute() 26764 1726882723.66447: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882723.66460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882723.66480: variable 'omit' from source: magic vars 26764 1726882723.66871: variable 'ansible_distribution_major_version' from source: facts 26764 1726882723.66894: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882723.67028: variable 'network_state' from source: role '' defaults 26764 1726882723.67045: Evaluated conditional (network_state != {}): False 26764 1726882723.67056: when evaluation is False, skipping this task 26764 1726882723.67066: _execute() done 26764 1726882723.67077: dumping result to json 26764 1726882723.67090: done dumping result, returning 26764 1726882723.67106: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-9875-c9a3-000000000011] 26764 1726882723.67118: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000011 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882723.67253: no more pending results, returning what we have 26764 1726882723.67257: results queue empty 26764 1726882723.67259: checking for any_errors_fatal 26764 1726882723.67266: done checking for any_errors_fatal 26764 1726882723.67273: checking for max_fail_percentage 26764 1726882723.67274: done checking for max_fail_percentage 26764 1726882723.67275: checking to see if all hosts have failed and the running result is not ok 26764 1726882723.67276: done checking to see if all hosts have failed 26764 1726882723.67277: getting the remaining hosts for this loop 26764 1726882723.67278: done getting the remaining hosts for this loop 26764 1726882723.67281: getting the next task for host managed_node2 26764 1726882723.67289: done getting next task for host managed_node2 26764 1726882723.67292: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26764 1726882723.67295: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882723.67309: getting variables 26764 1726882723.67310: in VariableManager get_vars() 26764 1726882723.67346: Calling all_inventory to load vars for managed_node2 26764 1726882723.67349: Calling groups_inventory to load vars for managed_node2 26764 1726882723.67352: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882723.67365: Calling all_plugins_play to load vars for managed_node2 26764 1726882723.67368: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882723.67371: Calling groups_plugins_play to load vars for managed_node2 26764 1726882723.68413: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000011 26764 1726882723.68416: WORKER PROCESS EXITING 26764 1726882723.69221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882723.70934: done with get_vars() 26764 1726882723.70959: done getting variables 26764 1726882723.71019: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:43 -0400 (0:00:00.055) 0:00:09.652 ****** 26764 1726882723.71058: entering _queue_task() for managed_node2/fail 26764 1726882723.71339: worker is 1 (out of 1 available) 26764 1726882723.71353: exiting _queue_task() for managed_node2/fail 26764 1726882723.71369: done queuing things up, now waiting for results queue to drain 26764 1726882723.71371: waiting for pending results... 
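The two abort tasks traced here (main.yml:11 for the initscripts provider and main.yml:18 for managed hosts below version 8) are both skipped because the only conditional the log prints, network_state != {}, evaluates to False: network_state comes from the role defaults and is empty in this run. The real tasks presumably combine that test with provider and version checks that the trace does not show; the snippet below is a simplified, assumed illustration of the fail-with-when pattern being exercised:

    # Simplified illustration of the guard pattern; not the role's actual task
    - name: Abort when network_state is used in an unsupported setup
      ansible.builtin.fail:
        msg: "The network_state variable cannot be applied in this configuration."
      when: network_state != {}    # the conditional the trace shows evaluating to False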
26764 1726882723.71650: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26764 1726882723.71762: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000012 26764 1726882723.71785: variable 'ansible_search_path' from source: unknown 26764 1726882723.71793: variable 'ansible_search_path' from source: unknown 26764 1726882723.71846: calling self._execute() 26764 1726882723.71947: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882723.71960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882723.71979: variable 'omit' from source: magic vars 26764 1726882723.72368: variable 'ansible_distribution_major_version' from source: facts 26764 1726882723.72386: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882723.72516: variable 'network_state' from source: role '' defaults 26764 1726882723.72532: Evaluated conditional (network_state != {}): False 26764 1726882723.72540: when evaluation is False, skipping this task 26764 1726882723.72548: _execute() done 26764 1726882723.72555: dumping result to json 26764 1726882723.72564: done dumping result, returning 26764 1726882723.72585: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-9875-c9a3-000000000012] 26764 1726882723.72597: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000012 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882723.72742: no more pending results, returning what we have 26764 1726882723.72746: results queue empty 26764 1726882723.72747: checking for any_errors_fatal 26764 1726882723.72754: done checking for any_errors_fatal 26764 1726882723.72755: checking for max_fail_percentage 26764 1726882723.72757: done checking for max_fail_percentage 26764 1726882723.72758: checking to see if all hosts have failed and the running result is not ok 26764 1726882723.72759: done checking to see if all hosts have failed 26764 1726882723.72759: getting the remaining hosts for this loop 26764 1726882723.72761: done getting the remaining hosts for this loop 26764 1726882723.72766: getting the next task for host managed_node2 26764 1726882723.72778: done getting next task for host managed_node2 26764 1726882723.72782: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26764 1726882723.72784: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882723.72799: getting variables 26764 1726882723.72803: in VariableManager get_vars() 26764 1726882723.72843: Calling all_inventory to load vars for managed_node2 26764 1726882723.72846: Calling groups_inventory to load vars for managed_node2 26764 1726882723.72848: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882723.72861: Calling all_plugins_play to load vars for managed_node2 26764 1726882723.72865: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882723.72869: Calling groups_plugins_play to load vars for managed_node2 26764 1726882723.73921: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000012 26764 1726882723.73925: WORKER PROCESS EXITING 26764 1726882723.74529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882723.76266: done with get_vars() 26764 1726882723.76288: done getting variables 26764 1726882723.76354: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:38:43 -0400 (0:00:00.053) 0:00:09.705 ****** 26764 1726882723.76384: entering _queue_task() for managed_node2/fail 26764 1726882723.76641: worker is 1 (out of 1 available) 26764 1726882723.76659: exiting _queue_task() for managed_node2/fail 26764 1726882723.76678: done queuing things up, now waiting for results queue to drain 26764 1726882723.76680: waiting for pending results... 
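The task queued here, "Abort applying teaming configuration if the system version of the managed host is EL10 or later" (main.yml:25), is evaluated in the entries that follow; its recorded condition, ansible_distribution_major_version | int > 9, resolves to False on this EL9 host, so the task is skipped. A rough sketch of such a version guard, with wording assumed rather than taken from the role:

    # Assumed wording; only the when-condition is quoted from the trace
    - name: Abort applying teaming configuration on EL10 or later
      ansible.builtin.fail:
        msg: "Team interfaces are not supported on this distribution version."
      when: ansible_distribution_major_version | int > 9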
26764 1726882723.76944: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26764 1726882723.77049: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000013 26764 1726882723.77071: variable 'ansible_search_path' from source: unknown 26764 1726882723.77081: variable 'ansible_search_path' from source: unknown 26764 1726882723.77129: calling self._execute() 26764 1726882723.77220: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882723.77235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882723.77251: variable 'omit' from source: magic vars 26764 1726882723.77619: variable 'ansible_distribution_major_version' from source: facts 26764 1726882723.77645: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882723.77833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882723.80616: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882723.80702: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882723.80741: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882723.80780: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882723.80820: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882723.80904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882723.80942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882723.80976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882723.81032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882723.81051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882723.81151: variable 'ansible_distribution_major_version' from source: facts 26764 1726882723.81172: Evaluated conditional (ansible_distribution_major_version | int > 9): False 26764 1726882723.81178: when evaluation is False, skipping this task 26764 1726882723.81185: _execute() done 26764 1726882723.81191: dumping result to json 26764 1726882723.81199: done dumping result, returning 26764 1726882723.81210: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-9875-c9a3-000000000013] 26764 1726882723.81224: sending task result for task 
0e448fcc-3ce9-9875-c9a3-000000000013 26764 1726882723.81340: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000013 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 26764 1726882723.81390: no more pending results, returning what we have 26764 1726882723.81394: results queue empty 26764 1726882723.81395: checking for any_errors_fatal 26764 1726882723.81399: done checking for any_errors_fatal 26764 1726882723.81400: checking for max_fail_percentage 26764 1726882723.81402: done checking for max_fail_percentage 26764 1726882723.81403: checking to see if all hosts have failed and the running result is not ok 26764 1726882723.81404: done checking to see if all hosts have failed 26764 1726882723.81405: getting the remaining hosts for this loop 26764 1726882723.81406: done getting the remaining hosts for this loop 26764 1726882723.81410: getting the next task for host managed_node2 26764 1726882723.81417: done getting next task for host managed_node2 26764 1726882723.81421: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26764 1726882723.81423: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882723.81439: getting variables 26764 1726882723.81441: in VariableManager get_vars() 26764 1726882723.81483: Calling all_inventory to load vars for managed_node2 26764 1726882723.81487: Calling groups_inventory to load vars for managed_node2 26764 1726882723.81489: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882723.81500: Calling all_plugins_play to load vars for managed_node2 26764 1726882723.81503: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882723.81505: Calling groups_plugins_play to load vars for managed_node2 26764 1726882723.82517: WORKER PROCESS EXITING 26764 1726882723.83344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882723.85105: done with get_vars() 26764 1726882723.85126: done getting variables 26764 1726882723.85222: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:43 -0400 (0:00:00.088) 0:00:09.794 ****** 26764 1726882723.85246: entering _queue_task() for managed_node2/dnf 26764 1726882723.85531: worker is 1 (out of 1 available) 26764 1726882723.85542: exiting _queue_task() for managed_node2/dnf 26764 1726882723.85555: done queuing things up, now waiting for results queue to drain 26764 1726882723.85556: waiting for pending results... 
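For the DNF check task queued above (main.yml:36), the trace evaluates two conditions: `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` is True, but `__network_wireless_connections_defined or __network_team_connections_defined` is False, so the task is skipped. A rough sketch of a task with that conditional shape; the module arguments and `check_mode` are assumptions, only the `when:` expressions come from the recorded evaluations:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # assumed argument
        state: latest                   # assumed argument
      check_mode: true                  # assumed; the task title says it only checks
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined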
26764 1726882723.85836: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26764 1726882723.85951: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000014 26764 1726882723.85973: variable 'ansible_search_path' from source: unknown 26764 1726882723.85982: variable 'ansible_search_path' from source: unknown 26764 1726882723.86031: calling self._execute() 26764 1726882723.86119: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882723.86138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882723.86155: variable 'omit' from source: magic vars 26764 1726882723.86537: variable 'ansible_distribution_major_version' from source: facts 26764 1726882723.86557: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882723.86773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882723.89269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882723.89345: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882723.89401: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882723.89441: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882723.89478: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882723.89572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882723.89619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882723.89652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882723.89704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882723.89734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882723.89865: variable 'ansible_distribution' from source: facts 26764 1726882723.89877: variable 'ansible_distribution_major_version' from source: facts 26764 1726882723.89896: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 26764 1726882723.90030: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882723.90183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882723.90211: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882723.90244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882723.90302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882723.90323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882723.90379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882723.90409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882723.90438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882723.90495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882723.90516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882723.90559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882723.90600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882723.90633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882723.90684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882723.90713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882723.90891: variable 'network_connections' from source: role '' defaults 26764 1726882723.90947: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882723.91189: variable 'network_connections' from source: role '' defaults 26764 1726882723.91208: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26764 1726882723.91215: when evaluation 
is False, skipping this task 26764 1726882723.91221: _execute() done 26764 1726882723.91228: dumping result to json 26764 1726882723.91238: done dumping result, returning 26764 1726882723.91255: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-9875-c9a3-000000000014] 26764 1726882723.91269: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000014 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26764 1726882723.91418: no more pending results, returning what we have 26764 1726882723.91429: results queue empty 26764 1726882723.91430: checking for any_errors_fatal 26764 1726882723.91435: done checking for any_errors_fatal 26764 1726882723.91436: checking for max_fail_percentage 26764 1726882723.91438: done checking for max_fail_percentage 26764 1726882723.91439: checking to see if all hosts have failed and the running result is not ok 26764 1726882723.91440: done checking to see if all hosts have failed 26764 1726882723.91440: getting the remaining hosts for this loop 26764 1726882723.91442: done getting the remaining hosts for this loop 26764 1726882723.91445: getting the next task for host managed_node2 26764 1726882723.91453: done getting next task for host managed_node2 26764 1726882723.91457: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26764 1726882723.91458: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882723.91474: getting variables 26764 1726882723.91476: in VariableManager get_vars() 26764 1726882723.91515: Calling all_inventory to load vars for managed_node2 26764 1726882723.91518: Calling groups_inventory to load vars for managed_node2 26764 1726882723.91520: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882723.91531: Calling all_plugins_play to load vars for managed_node2 26764 1726882723.91534: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882723.91536: Calling groups_plugins_play to load vars for managed_node2 26764 1726882723.92621: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000014 26764 1726882723.92624: WORKER PROCESS EXITING 26764 1726882723.93446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882723.94965: done with get_vars() 26764 1726882723.94983: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 26764 1726882723.95038: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:43 -0400 (0:00:00.098) 0:00:09.892 ****** 26764 1726882723.95059: entering _queue_task() for managed_node2/yum 26764 1726882723.95061: Creating lock for yum 26764 1726882723.95284: worker is 1 (out of 1 available) 26764 1726882723.95298: exiting _queue_task() for managed_node2/yum 26764 1726882723.95310: done queuing things up, now waiting for results queue to drain 26764 1726882723.95311: waiting for pending results... 
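Note the redirect recorded above, `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf`: in this ansible-core release the `yum` action resolves to the `dnf` action plugin, which is why dnf.py is loaded even though the task at main.yml:48 targets YUM. That variant is guarded by `ansible_distribution_major_version | int < 8`, recorded as the false_condition below. A sketch with that shape, where everything except the `when:` expression is an assumption:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"  # assumed argument
        state: latest                   # assumed argument
      check_mode: true                  # assumed
      when: ansible_distribution_major_version | int < 8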
26764 1726882723.95494: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26764 1726882723.95557: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000015 26764 1726882723.95573: variable 'ansible_search_path' from source: unknown 26764 1726882723.95577: variable 'ansible_search_path' from source: unknown 26764 1726882723.95608: calling self._execute() 26764 1726882723.95673: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882723.95678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882723.95690: variable 'omit' from source: magic vars 26764 1726882723.95957: variable 'ansible_distribution_major_version' from source: facts 26764 1726882723.95970: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882723.96092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882723.98182: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882723.98225: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882723.98252: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882723.98284: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882723.98303: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882723.98359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882723.98385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882723.98404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882723.98430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882723.98440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882723.98510: variable 'ansible_distribution_major_version' from source: facts 26764 1726882723.98523: Evaluated conditional (ansible_distribution_major_version | int < 8): False 26764 1726882723.98525: when evaluation is False, skipping this task 26764 1726882723.98528: _execute() done 26764 1726882723.98531: dumping result to json 26764 1726882723.98535: done dumping result, returning 26764 1726882723.98542: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-9875-c9a3-000000000015] 26764 
1726882723.98547: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000015 26764 1726882723.98632: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000015 26764 1726882723.98635: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 26764 1726882723.98691: no more pending results, returning what we have 26764 1726882723.98695: results queue empty 26764 1726882723.98696: checking for any_errors_fatal 26764 1726882723.98702: done checking for any_errors_fatal 26764 1726882723.98703: checking for max_fail_percentage 26764 1726882723.98704: done checking for max_fail_percentage 26764 1726882723.98705: checking to see if all hosts have failed and the running result is not ok 26764 1726882723.98706: done checking to see if all hosts have failed 26764 1726882723.98707: getting the remaining hosts for this loop 26764 1726882723.98708: done getting the remaining hosts for this loop 26764 1726882723.98711: getting the next task for host managed_node2 26764 1726882723.98718: done getting next task for host managed_node2 26764 1726882723.98722: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26764 1726882723.98724: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882723.98736: getting variables 26764 1726882723.98738: in VariableManager get_vars() 26764 1726882723.98775: Calling all_inventory to load vars for managed_node2 26764 1726882723.98779: Calling groups_inventory to load vars for managed_node2 26764 1726882723.98782: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882723.98790: Calling all_plugins_play to load vars for managed_node2 26764 1726882723.98792: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882723.98794: Calling groups_plugins_play to load vars for managed_node2 26764 1726882723.99596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882724.01339: done with get_vars() 26764 1726882724.01377: done getting variables 26764 1726882724.01500: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:44 -0400 (0:00:00.064) 0:00:09.957 ****** 26764 1726882724.01537: entering _queue_task() for managed_node2/fail 26764 1726882724.01977: worker is 1 (out of 1 available) 26764 1726882724.01990: exiting _queue_task() for managed_node2/fail 26764 1726882724.02003: done queuing things up, now waiting for results queue to drain 26764 1726882724.02004: waiting for pending results... 
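The consent task queued above (main.yml:60) is another `fail` action guarded by the same wireless/team condition; the trace below evaluates `__network_wireless_connections_defined or __network_team_connections_defined` to False and skips it. A minimal sketch, with the message text assumed and the condition taken from the recorded false_condition:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: NetworkManager must be restarted to handle wireless or team interfaces  # assumed wording
      when: __network_wireless_connections_defined or __network_team_connections_defined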
26764 1726882724.02391: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26764 1726882724.02506: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000016 26764 1726882724.02531: variable 'ansible_search_path' from source: unknown 26764 1726882724.02539: variable 'ansible_search_path' from source: unknown 26764 1726882724.02587: calling self._execute() 26764 1726882724.02690: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882724.02703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882724.02721: variable 'omit' from source: magic vars 26764 1726882724.03134: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.03153: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882724.03299: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882724.03540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882724.06669: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882724.06751: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882724.06786: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882724.06834: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882724.06857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882724.06926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.06948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.06975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.07001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.07016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.07052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.07075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.07092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.07121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.07132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.07157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.07178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.07194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.07218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.07231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.07348: variable 'network_connections' from source: role '' defaults 26764 1726882724.07390: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882724.07546: variable 'network_connections' from source: role '' defaults 26764 1726882724.07563: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26764 1726882724.07566: when evaluation is False, skipping this task 26764 1726882724.07572: _execute() done 26764 1726882724.07574: dumping result to json 26764 1726882724.07577: done dumping result, returning 26764 1726882724.07585: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-9875-c9a3-000000000016] 26764 1726882724.07590: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000016 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26764 1726882724.07728: no more pending results, returning what we have 26764 1726882724.07731: results queue empty 26764 1726882724.07732: checking for any_errors_fatal 26764 1726882724.07738: done checking for any_errors_fatal 26764 1726882724.07738: checking for max_fail_percentage 26764 1726882724.07740: done checking for max_fail_percentage 26764 1726882724.07741: checking to see if all hosts have failed and the running result is not ok 26764 1726882724.07742: done checking to see if all hosts have failed 26764 1726882724.07742: getting the remaining hosts for this loop 26764 1726882724.07744: done getting the remaining hosts for this loop 26764 1726882724.07747: getting the next task for host managed_node2 26764 
1726882724.07754: done getting next task for host managed_node2 26764 1726882724.07758: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 26764 1726882724.07759: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882724.07773: getting variables 26764 1726882724.07775: in VariableManager get_vars() 26764 1726882724.07814: Calling all_inventory to load vars for managed_node2 26764 1726882724.07816: Calling groups_inventory to load vars for managed_node2 26764 1726882724.07819: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882724.07834: Calling all_plugins_play to load vars for managed_node2 26764 1726882724.07837: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882724.07840: Calling groups_plugins_play to load vars for managed_node2 26764 1726882724.08933: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000016 26764 1726882724.08937: WORKER PROCESS EXITING 26764 1726882724.09032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882724.10302: done with get_vars() 26764 1726882724.10317: done getting variables 26764 1726882724.10361: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:44 -0400 (0:00:00.088) 0:00:10.045 ****** 26764 1726882724.10384: entering _queue_task() for managed_node2/package 26764 1726882724.10597: worker is 1 (out of 1 available) 26764 1726882724.10611: exiting _queue_task() for managed_node2/package 26764 1726882724.10624: done queuing things up, now waiting for results queue to drain 26764 1726882724.10625: waiting for pending results... 
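The Install packages task (main.yml:73) resolves `network_packages` below from the role defaults (`__network_packages_default_nm`, the wpa_supplicant, wireless, team, and initscripts defaults) and compares it against the gathered package facts; the recorded condition `not network_packages is subset(ansible_facts.packages.keys())` evaluates to False because the packages are already present, so the install is skipped. A sketch of a task with that shape, where `state` is an assumption and the package list and `when:` expression come from the trace:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present   # assumed
      when: not network_packages is subset(ansible_facts.packages.keys())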
26764 1726882724.10793: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 26764 1726882724.10860: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000017 26764 1726882724.10875: variable 'ansible_search_path' from source: unknown 26764 1726882724.10879: variable 'ansible_search_path' from source: unknown 26764 1726882724.10907: calling self._execute() 26764 1726882724.10971: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882724.10977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882724.10990: variable 'omit' from source: magic vars 26764 1726882724.11250: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.11259: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882724.11395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882724.11584: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882724.11616: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882724.11642: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882724.11671: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882724.11742: variable 'network_packages' from source: role '' defaults 26764 1726882724.11816: variable '__network_provider_setup' from source: role '' defaults 26764 1726882724.11823: variable '__network_service_name_default_nm' from source: role '' defaults 26764 1726882724.11876: variable '__network_service_name_default_nm' from source: role '' defaults 26764 1726882724.11884: variable '__network_packages_default_nm' from source: role '' defaults 26764 1726882724.11926: variable '__network_packages_default_nm' from source: role '' defaults 26764 1726882724.12041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882724.13439: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882724.13493: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882724.13519: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882724.13542: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882724.13561: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882724.13622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.13640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.13658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.13692: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.13703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.13733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.13749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.13769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.13793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.13807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.13949: variable '__network_packages_default_gobject_packages' from source: role '' defaults 26764 1726882724.14019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.14038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.14055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.14083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.14093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.14153: variable 'ansible_python' from source: facts 26764 1726882724.14175: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 26764 1726882724.14228: variable '__network_wpa_supplicant_required' from source: role '' defaults 26764 1726882724.14286: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26764 1726882724.14372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.14387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 26764 1726882724.14403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.14428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.14440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.14476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.14496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.14513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.14537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.14549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.14643: variable 'network_connections' from source: role '' defaults 26764 1726882724.14681: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882724.14859: variable 'network_connections' from source: role '' defaults 26764 1726882724.14879: variable '__network_packages_default_wireless' from source: role '' defaults 26764 1726882724.14934: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882724.15133: variable 'network_connections' from source: role '' defaults 26764 1726882724.15144: variable '__network_packages_default_team' from source: role '' defaults 26764 1726882724.15200: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882724.15397: variable 'network_connections' from source: role '' defaults 26764 1726882724.15435: variable '__network_service_name_default_initscripts' from source: role '' defaults 26764 1726882724.15479: variable '__network_service_name_default_initscripts' from source: role '' defaults 26764 1726882724.15485: variable '__network_packages_default_initscripts' from source: role '' defaults 26764 1726882724.15526: variable '__network_packages_default_initscripts' from source: role '' defaults 26764 1726882724.15669: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 26764 1726882724.15971: variable 'network_connections' from source: role '' defaults 26764 1726882724.15979: variable 'ansible_distribution' from source: facts 26764 1726882724.15982: variable '__network_rh_distros' from source: role '' defaults 26764 1726882724.15984: variable 'ansible_distribution_major_version' from source: 
facts 26764 1726882724.15996: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 26764 1726882724.16104: variable 'ansible_distribution' from source: facts 26764 1726882724.16108: variable '__network_rh_distros' from source: role '' defaults 26764 1726882724.16110: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.16119: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 26764 1726882724.16226: variable 'ansible_distribution' from source: facts 26764 1726882724.16229: variable '__network_rh_distros' from source: role '' defaults 26764 1726882724.16234: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.16258: variable 'network_provider' from source: set_fact 26764 1726882724.16271: variable 'ansible_facts' from source: unknown 26764 1726882724.16712: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 26764 1726882724.16715: when evaluation is False, skipping this task 26764 1726882724.16721: _execute() done 26764 1726882724.16724: dumping result to json 26764 1726882724.16726: done dumping result, returning 26764 1726882724.16733: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-9875-c9a3-000000000017] 26764 1726882724.16736: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000017 26764 1726882724.16816: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000017 26764 1726882724.16819: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 26764 1726882724.16900: no more pending results, returning what we have 26764 1726882724.16904: results queue empty 26764 1726882724.16905: checking for any_errors_fatal 26764 1726882724.16909: done checking for any_errors_fatal 26764 1726882724.16909: checking for max_fail_percentage 26764 1726882724.16911: done checking for max_fail_percentage 26764 1726882724.16912: checking to see if all hosts have failed and the running result is not ok 26764 1726882724.16913: done checking to see if all hosts have failed 26764 1726882724.16913: getting the remaining hosts for this loop 26764 1726882724.16914: done getting the remaining hosts for this loop 26764 1726882724.16918: getting the next task for host managed_node2 26764 1726882724.16923: done getting next task for host managed_node2 26764 1726882724.16927: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 26764 1726882724.16929: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882724.16949: getting variables 26764 1726882724.16950: in VariableManager get_vars() 26764 1726882724.16988: Calling all_inventory to load vars for managed_node2 26764 1726882724.16991: Calling groups_inventory to load vars for managed_node2 26764 1726882724.16992: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882724.17001: Calling all_plugins_play to load vars for managed_node2 26764 1726882724.17003: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882724.17006: Calling groups_plugins_play to load vars for managed_node2 26764 1726882724.17787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882724.18813: done with get_vars() 26764 1726882724.18827: done getting variables 26764 1726882724.18870: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:44 -0400 (0:00:00.085) 0:00:10.130 ****** 26764 1726882724.18891: entering _queue_task() for managed_node2/package 26764 1726882724.19094: worker is 1 (out of 1 available) 26764 1726882724.19108: exiting _queue_task() for managed_node2/package 26764 1726882724.19120: done queuing things up, now waiting for results queue to drain 26764 1726882724.19121: waiting for pending results... 
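The nmstate install task (main.yml:85) is gated purely on `network_state`; the trace below shows `network_state != {}` evaluating to False because the role defaults leave it empty, so nothing is installed. A sketch with that shape, where the package names are taken from the task title and `state` is an assumption:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager   # from the task title, not verified against the role source
          - nmstate
        state: present       # assumed
      when: network_state != {}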
26764 1726882724.19285: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 26764 1726882724.19348: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000018 26764 1726882724.19358: variable 'ansible_search_path' from source: unknown 26764 1726882724.19362: variable 'ansible_search_path' from source: unknown 26764 1726882724.19391: calling self._execute() 26764 1726882724.19452: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882724.19456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882724.19468: variable 'omit' from source: magic vars 26764 1726882724.19716: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.19726: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882724.19809: variable 'network_state' from source: role '' defaults 26764 1726882724.19818: Evaluated conditional (network_state != {}): False 26764 1726882724.19821: when evaluation is False, skipping this task 26764 1726882724.19823: _execute() done 26764 1726882724.19825: dumping result to json 26764 1726882724.19830: done dumping result, returning 26764 1726882724.19837: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-9875-c9a3-000000000018] 26764 1726882724.19842: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000018 26764 1726882724.19929: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000018 26764 1726882724.19931: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882724.19986: no more pending results, returning what we have 26764 1726882724.19989: results queue empty 26764 1726882724.19990: checking for any_errors_fatal 26764 1726882724.19996: done checking for any_errors_fatal 26764 1726882724.19996: checking for max_fail_percentage 26764 1726882724.19998: done checking for max_fail_percentage 26764 1726882724.19999: checking to see if all hosts have failed and the running result is not ok 26764 1726882724.19999: done checking to see if all hosts have failed 26764 1726882724.20000: getting the remaining hosts for this loop 26764 1726882724.20001: done getting the remaining hosts for this loop 26764 1726882724.20004: getting the next task for host managed_node2 26764 1726882724.20009: done getting next task for host managed_node2 26764 1726882724.20012: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 26764 1726882724.20014: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882724.20026: getting variables 26764 1726882724.20028: in VariableManager get_vars() 26764 1726882724.20069: Calling all_inventory to load vars for managed_node2 26764 1726882724.20072: Calling groups_inventory to load vars for managed_node2 26764 1726882724.20073: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882724.20080: Calling all_plugins_play to load vars for managed_node2 26764 1726882724.20081: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882724.20083: Calling groups_plugins_play to load vars for managed_node2 26764 1726882724.20828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882724.21744: done with get_vars() 26764 1726882724.21759: done getting variables 26764 1726882724.21802: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:44 -0400 (0:00:00.029) 0:00:10.159 ****** 26764 1726882724.21821: entering _queue_task() for managed_node2/package 26764 1726882724.21993: worker is 1 (out of 1 available) 26764 1726882724.22004: exiting _queue_task() for managed_node2/package 26764 1726882724.22017: done queuing things up, now waiting for results queue to drain 26764 1726882724.22018: waiting for pending results... 
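The python3-libnmstate task (main.yml:96) uses the same `network_state != {}` guard and is skipped for the same reason. A sketch, with the package name taken from the task title and `state` assumed:

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present   # assumed
      when: network_state != {}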
26764 1726882724.22182: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 26764 1726882724.22241: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000019 26764 1726882724.22252: variable 'ansible_search_path' from source: unknown 26764 1726882724.22255: variable 'ansible_search_path' from source: unknown 26764 1726882724.22283: calling self._execute() 26764 1726882724.22343: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882724.22348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882724.22358: variable 'omit' from source: magic vars 26764 1726882724.22597: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.22606: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882724.22686: variable 'network_state' from source: role '' defaults 26764 1726882724.22694: Evaluated conditional (network_state != {}): False 26764 1726882724.22697: when evaluation is False, skipping this task 26764 1726882724.22700: _execute() done 26764 1726882724.22702: dumping result to json 26764 1726882724.22705: done dumping result, returning 26764 1726882724.22714: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-9875-c9a3-000000000019] 26764 1726882724.22760: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000019 26764 1726882724.22835: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000019 26764 1726882724.22838: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882724.22884: no more pending results, returning what we have 26764 1726882724.22887: results queue empty 26764 1726882724.22888: checking for any_errors_fatal 26764 1726882724.22891: done checking for any_errors_fatal 26764 1726882724.22892: checking for max_fail_percentage 26764 1726882724.22893: done checking for max_fail_percentage 26764 1726882724.22894: checking to see if all hosts have failed and the running result is not ok 26764 1726882724.22895: done checking to see if all hosts have failed 26764 1726882724.22896: getting the remaining hosts for this loop 26764 1726882724.22897: done getting the remaining hosts for this loop 26764 1726882724.22899: getting the next task for host managed_node2 26764 1726882724.22904: done getting next task for host managed_node2 26764 1726882724.22907: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 26764 1726882724.22908: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882724.22917: getting variables 26764 1726882724.22918: in VariableManager get_vars() 26764 1726882724.22939: Calling all_inventory to load vars for managed_node2 26764 1726882724.22941: Calling groups_inventory to load vars for managed_node2 26764 1726882724.22942: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882724.22948: Calling all_plugins_play to load vars for managed_node2 26764 1726882724.22950: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882724.22951: Calling groups_plugins_play to load vars for managed_node2 26764 1726882724.26419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882724.27552: done with get_vars() 26764 1726882724.27574: done getting variables 26764 1726882724.27651: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:44 -0400 (0:00:00.058) 0:00:10.218 ****** 26764 1726882724.27677: entering _queue_task() for managed_node2/service 26764 1726882724.27679: Creating lock for service 26764 1726882724.27968: worker is 1 (out of 1 available) 26764 1726882724.27980: exiting _queue_task() for managed_node2/service 26764 1726882724.27991: done queuing things up, now waiting for results queue to drain 26764 1726882724.27992: waiting for pending results... 
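The restart task (main.yml:109) is a `service` action — note the new lock created for `service` above — guarded once more by the wireless/team condition, which again evaluates to False below. A minimal sketch, with the service name taken from the task title and `state: restarted` assumed from the title's wording:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted   # assumed from the task title
      when: __network_wireless_connections_defined or __network_team_connections_defined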
26764 1726882724.28251: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 26764 1726882724.28341: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000001a 26764 1726882724.28353: variable 'ansible_search_path' from source: unknown 26764 1726882724.28357: variable 'ansible_search_path' from source: unknown 26764 1726882724.28394: calling self._execute() 26764 1726882724.28479: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882724.28484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882724.28497: variable 'omit' from source: magic vars 26764 1726882724.28850: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.28869: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882724.28974: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882724.29169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882724.31490: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882724.31561: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882724.31605: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882724.31639: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882724.31668: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882724.31738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.31767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.31790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.31833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.31846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.31889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.31914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.31936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 26764 1726882724.31975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.31988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.32029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.32048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.32072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.32109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.32124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.32291: variable 'network_connections' from source: role '' defaults 26764 1726882724.32338: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882724.32583: variable 'network_connections' from source: role '' defaults 26764 1726882724.32599: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26764 1726882724.32602: when evaluation is False, skipping this task 26764 1726882724.32606: _execute() done 26764 1726882724.32608: dumping result to json 26764 1726882724.32610: done dumping result, returning 26764 1726882724.32619: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-9875-c9a3-00000000001a] 26764 1726882724.32621: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001a 26764 1726882724.32719: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001a 26764 1726882724.32723: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26764 1726882724.32767: no more pending results, returning what we have 26764 1726882724.32771: results queue empty 26764 1726882724.32772: checking for any_errors_fatal 26764 1726882724.32780: done checking for any_errors_fatal 26764 1726882724.32780: checking for max_fail_percentage 26764 1726882724.32782: done checking for max_fail_percentage 26764 1726882724.32783: checking to see if all hosts have failed and the running result is not ok 26764 1726882724.32783: done checking to see if all hosts have failed 26764 1726882724.32784: getting the remaining hosts for this loop 26764 1726882724.32785: done getting the remaining hosts for this loop 26764 1726882724.32789: getting the next 
task for host managed_node2 26764 1726882724.32796: done getting next task for host managed_node2 26764 1726882724.32799: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26764 1726882724.32801: ^ state is: HOST STATE: block=3, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882724.32812: getting variables 26764 1726882724.32814: in VariableManager get_vars() 26764 1726882724.32848: Calling all_inventory to load vars for managed_node2 26764 1726882724.32850: Calling groups_inventory to load vars for managed_node2 26764 1726882724.32859: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882724.32872: Calling all_plugins_play to load vars for managed_node2 26764 1726882724.32875: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882724.32878: Calling groups_plugins_play to load vars for managed_node2 26764 1726882724.34420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882724.36072: done with get_vars() 26764 1726882724.36091: done getting variables 26764 1726882724.36141: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:44 -0400 (0:00:00.084) 0:00:10.303 ****** 26764 1726882724.36167: entering _queue_task() for managed_node2/service 26764 1726882724.36426: worker is 1 (out of 1 available) 26764 1726882724.36439: exiting _queue_task() for managed_node2/service 26764 1726882724.36452: done queuing things up, now waiting for results queue to drain 26764 1726882724.36453: waiting for pending results... 
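[Editor's note] The trace above shows the task at roles/network/tasks/main.yml:109 being skipped: the conditional (__network_wireless_connections_defined or __network_team_connections_defined) evaluated to False, so the 'service' action never ran. The sketch below is a hypothetical reconstruction of what such a conditional restart task could look like, based only on the task name, the action plugin, and the conditionals visible in this trace; it is not taken from the role's actual source, and the module arguments are assumptions.

    # Hypothetical sketch, reconstructed from this trace (not the role source).
    # network_service_name is a role variable seen elsewhere in this log; its
    # use here is an assumption.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined

When neither wireless nor team connections are defined, Ansible reports the task as "skipping" with skip_reason "Conditional result was False", exactly as shown in the result block above.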
26764 1726882724.36726: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26764 1726882724.36826: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000001b 26764 1726882724.36840: variable 'ansible_search_path' from source: unknown 26764 1726882724.36843: variable 'ansible_search_path' from source: unknown 26764 1726882724.36882: calling self._execute() 26764 1726882724.36969: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882724.36973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882724.36983: variable 'omit' from source: magic vars 26764 1726882724.37366: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.37376: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882724.37536: variable 'network_provider' from source: set_fact 26764 1726882724.37540: variable 'network_state' from source: role '' defaults 26764 1726882724.37556: Evaluated conditional (network_provider == "nm" or network_state != {}): True 26764 1726882724.37566: variable 'omit' from source: magic vars 26764 1726882724.37597: variable 'omit' from source: magic vars 26764 1726882724.37622: variable 'network_service_name' from source: role '' defaults 26764 1726882724.37694: variable 'network_service_name' from source: role '' defaults 26764 1726882724.37801: variable '__network_provider_setup' from source: role '' defaults 26764 1726882724.37806: variable '__network_service_name_default_nm' from source: role '' defaults 26764 1726882724.37872: variable '__network_service_name_default_nm' from source: role '' defaults 26764 1726882724.37882: variable '__network_packages_default_nm' from source: role '' defaults 26764 1726882724.37942: variable '__network_packages_default_nm' from source: role '' defaults 26764 1726882724.38167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882724.40661: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882724.40733: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882724.40770: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882724.40801: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882724.40832: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882724.40905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.40938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.40962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.41004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 26764 1726882724.41020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.41070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.41093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.41117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.41160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.41176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.41396: variable '__network_packages_default_gobject_packages' from source: role '' defaults 26764 1726882724.41506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.41529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.41553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.41600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.41610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.41696: variable 'ansible_python' from source: facts 26764 1726882724.41719: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 26764 1726882724.42074: variable '__network_wpa_supplicant_required' from source: role '' defaults 26764 1726882724.42077: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26764 1726882724.42080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.42083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.42086: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.42088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.42101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.42151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882724.42175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882724.42199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882724.42240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882724.42254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882724.42389: variable 'network_connections' from source: role '' defaults 26764 1726882724.42432: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882724.42691: variable 'network_connections' from source: role '' defaults 26764 1726882724.42711: variable '__network_packages_default_wireless' from source: role '' defaults 26764 1726882724.42792: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882724.43057: variable 'network_connections' from source: role '' defaults 26764 1726882724.43073: variable '__network_packages_default_team' from source: role '' defaults 26764 1726882724.43151: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882724.43451: variable 'network_connections' from source: role '' defaults 26764 1726882724.43500: variable '__network_service_name_default_initscripts' from source: role '' defaults 26764 1726882724.43563: variable '__network_service_name_default_initscripts' from source: role '' defaults 26764 1726882724.43570: variable '__network_packages_default_initscripts' from source: role '' defaults 26764 1726882724.43630: variable '__network_packages_default_initscripts' from source: role '' defaults 26764 1726882724.43867: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 26764 1726882724.44138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882724.44316: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882724.44359: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882724.44400: Loading TestModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882724.44434: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882724.44739: variable 'network_connections' from source: role '' defaults 26764 1726882724.44742: variable 'ansible_distribution' from source: facts 26764 1726882724.44744: variable '__network_rh_distros' from source: role '' defaults 26764 1726882724.44751: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.44762: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 26764 1726882724.44941: variable 'ansible_distribution' from source: facts 26764 1726882724.44945: variable '__network_rh_distros' from source: role '' defaults 26764 1726882724.44950: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.44962: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 26764 1726882724.45137: variable 'ansible_distribution' from source: facts 26764 1726882724.45141: variable '__network_rh_distros' from source: role '' defaults 26764 1726882724.45144: variable 'ansible_distribution_major_version' from source: facts 26764 1726882724.45185: variable 'network_provider' from source: set_fact 26764 1726882724.45205: variable 'omit' from source: magic vars 26764 1726882724.45229: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882724.45254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882724.45277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882724.45294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882724.45304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882724.45330: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882724.45333: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882724.45336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882724.45431: Set connection var ansible_shell_executable to /bin/sh 26764 1726882724.45434: Set connection var ansible_shell_type to sh 26764 1726882724.45444: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882724.45449: Set connection var ansible_timeout to 10 26764 1726882724.45455: Set connection var ansible_connection to ssh 26764 1726882724.45458: Set connection var ansible_pipelining to False 26764 1726882724.45488: variable 'ansible_shell_executable' from source: unknown 26764 1726882724.45491: variable 'ansible_connection' from source: unknown 26764 1726882724.45494: variable 'ansible_module_compression' from source: unknown 26764 1726882724.45496: variable 'ansible_shell_type' from source: unknown 26764 1726882724.45498: variable 'ansible_shell_executable' from source: unknown 26764 1726882724.45500: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882724.45505: variable 'ansible_pipelining' from source: unknown 26764 1726882724.45507: variable 'ansible_timeout' from source: unknown 26764 1726882724.45511: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 26764 1726882724.45613: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882724.45622: variable 'omit' from source: magic vars 26764 1726882724.45628: starting attempt loop 26764 1726882724.45631: running the handler 26764 1726882724.45710: variable 'ansible_facts' from source: unknown 26764 1726882724.46448: _low_level_execute_command(): starting 26764 1726882724.46461: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882724.47206: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882724.47221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882724.47233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882724.47247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882724.47289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882724.47296: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882724.47308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882724.47325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882724.47331: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882724.47338: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882724.47346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882724.47355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882724.47369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882724.47378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882724.47383: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882724.47392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882724.47481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882724.47485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882724.47503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882724.47635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882724.49319: stdout chunk (state=3): >>>/root <<< 26764 1726882724.49439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882724.49537: stderr chunk (state=3): >>><<< 26764 1726882724.49551: stdout chunk (state=3): >>><<< 26764 1726882724.49683: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882724.49687: _low_level_execute_command(): starting 26764 1726882724.49689: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350 `" && echo ansible-tmp-1726882724.4958167-27262-7216107140350="` echo /root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350 `" ) && sleep 0' 26764 1726882724.50375: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882724.50391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882724.50405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882724.50429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882724.50479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882724.50492: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882724.50506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882724.50531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882724.50558: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882724.50575: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882724.50588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882724.50601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882724.50617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882724.50630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882724.50643: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882724.50672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882724.50747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882724.50787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882724.50804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882724.50937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882724.52879: stdout chunk (state=3): 
>>>ansible-tmp-1726882724.4958167-27262-7216107140350=/root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350 <<< 26764 1726882724.53068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882724.53072: stdout chunk (state=3): >>><<< 26764 1726882724.53075: stderr chunk (state=3): >>><<< 26764 1726882724.53171: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882724.4958167-27262-7216107140350=/root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882724.53175: variable 'ansible_module_compression' from source: unknown 26764 1726882724.53377: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 26764 1726882724.53381: ANSIBALLZ: Acquiring lock 26764 1726882724.53383: ANSIBALLZ: Lock acquired: 140693693673600 26764 1726882724.53385: ANSIBALLZ: Creating module 26764 1726882724.87621: ANSIBALLZ: Writing module into payload 26764 1726882724.87895: ANSIBALLZ: Writing module 26764 1726882724.87968: ANSIBALLZ: Renaming module 26764 1726882724.87989: ANSIBALLZ: Done creating module 26764 1726882724.88040: variable 'ansible_facts' from source: unknown 26764 1726882724.88281: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350/AnsiballZ_systemd.py 26764 1726882724.89106: Sending initial data 26764 1726882724.89109: Sent initial data (154 bytes) 26764 1726882724.90359: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882724.90379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882724.90406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882724.90430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882724.90479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882724.90495: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882724.90518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882724.90540: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882724.90552: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 
26764 1726882724.90563: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882724.90579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882724.90592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882724.90610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882724.90626: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882724.90640: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882724.90652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882724.90739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882724.90768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882724.90784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882724.90919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882724.92767: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882724.92855: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882724.92957: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmp87ap521a /root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350/AnsiballZ_systemd.py <<< 26764 1726882724.93052: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882724.96885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882724.97120: stderr chunk (state=3): >>><<< 26764 1726882724.97124: stdout chunk (state=3): >>><<< 26764 1726882724.97126: done transferring module to remote 26764 1726882724.97128: _low_level_execute_command(): starting 26764 1726882724.97130: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350/ /root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350/AnsiballZ_systemd.py && sleep 0' 26764 1726882724.98521: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882724.98531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882724.98546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882724.98553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882724.98598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882724.98605: stderr chunk (state=3): >>>debug2: match not 
found <<< 26764 1726882724.98614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882724.98628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882724.98637: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882724.98642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882724.98653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882724.98656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882724.98678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882724.98687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882724.98693: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882724.98702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882724.98779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882724.98987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882724.98990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882724.99295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882725.01239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882725.01243: stdout chunk (state=3): >>><<< 26764 1726882725.01250: stderr chunk (state=3): >>><<< 26764 1726882725.01266: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882725.01272: _low_level_execute_command(): starting 26764 1726882725.01285: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350/AnsiballZ_systemd.py && sleep 0' 26764 1726882725.02896: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882725.02905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882725.02917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
26764 1726882725.03086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882725.03124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882725.03134: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882725.03141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.03154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882725.03161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882725.03172: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882725.03180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882725.03189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882725.03200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882725.03207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882725.03213: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882725.03222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.03296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882725.03309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882725.03319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882725.03604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882725.28802: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 
0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9146368", "MemoryAvailable": "infinity", "CPUUsageNSec": "1834777000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft"<<< 26764 1726882725.28825: stdout chunk (state=3): >>>: "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": 
"no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", 
"enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 26764 1726882725.30333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882725.30379: stderr chunk (state=3): >>><<< 26764 1726882725.30383: stdout chunk (state=3): >>><<< 26764 1726882725.30398: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9146368", "MemoryAvailable": "infinity", "CPUUsageNSec": "1834777000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", 
"MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service 
dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
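[Editor's note] The decoded module result above shows that this task invoked the systemd module on the remote host with name=NetworkManager, state=started, enabled=true, scope=system, and that the unit was already active and enabled (ActiveState=active, UnitFileState=enabled), so the task reports changed=false. A standalone task with the same effect might look like the following sketch; the argument values are taken from the module_args echoed in the result, while everything else (task name placement, module namespace) is an assumption, not the role's actual source.

    # Sketch of an equivalent standalone task; argument values copied from the
    # module_args echoed in the result above, all other details assumed.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
        scope: system
      # The trace shows the role evaluated this conditional as True first:
      when: network_provider == "nm" or network_state != {}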
26764 1726882725.30512: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882725.30527: _low_level_execute_command(): starting 26764 1726882725.30533: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882724.4958167-27262-7216107140350/ > /dev/null 2>&1 && sleep 0' 26764 1726882725.31083: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882725.31095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882725.31102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882725.31116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882725.31150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882725.31157: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882725.31173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.31183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882725.31190: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882725.31198: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882725.31206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882725.31213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882725.31224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882725.31232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882725.31238: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882725.31247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.31320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882725.31333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882725.31342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882725.31467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882725.33253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882725.33296: stderr chunk (state=3): >>><<< 26764 1726882725.33300: stdout chunk (state=3): >>><<< 26764 1726882725.33310: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882725.33316: handler run complete 26764 1726882725.33357: attempt loop complete, returning result 26764 1726882725.33360: _execute() done 26764 1726882725.33362: dumping result to json 26764 1726882725.33380: done dumping result, returning 26764 1726882725.33388: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-9875-c9a3-00000000001b] 26764 1726882725.33392: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001b ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 1726882725.33856: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001b 26764 1726882725.33859: WORKER PROCESS EXITING 26764 1726882725.33867: no more pending results, returning what we have 26764 1726882725.33869: results queue empty 26764 1726882725.33870: checking for any_errors_fatal 26764 1726882725.33872: done checking for any_errors_fatal 26764 1726882725.33873: checking for max_fail_percentage 26764 1726882725.33876: done checking for max_fail_percentage 26764 1726882725.33877: checking to see if all hosts have failed and the running result is not ok 26764 1726882725.33878: done checking to see if all hosts have failed 26764 1726882725.33879: getting the remaining hosts for this loop 26764 1726882725.33880: done getting the remaining hosts for this loop 26764 1726882725.33883: getting the next task for host managed_node2 26764 1726882725.33887: done getting next task for host managed_node2 26764 1726882725.33889: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26764 1726882725.33890: ^ state is: HOST STATE: block=3, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882725.33897: getting variables 26764 1726882725.33898: in VariableManager get_vars() 26764 1726882725.33920: Calling all_inventory to load vars for managed_node2 26764 1726882725.33921: Calling groups_inventory to load vars for managed_node2 26764 1726882725.33923: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882725.33929: Calling all_plugins_play to load vars for managed_node2 26764 1726882725.33931: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882725.33933: Calling groups_plugins_play to load vars for managed_node2 26764 1726882725.34761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882725.36296: done with get_vars() 26764 1726882725.36310: done getting variables 26764 1726882725.36351: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:45 -0400 (0:00:01.002) 0:00:11.305 ****** 26764 1726882725.36379: entering _queue_task() for managed_node2/service 26764 1726882725.36580: worker is 1 (out of 1 available) 26764 1726882725.36593: exiting _queue_task() for managed_node2/service 26764 1726882725.36605: done queuing things up, now waiting for results queue to drain 26764 1726882725.36606: waiting for pending results... 26764 1726882725.36771: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26764 1726882725.36905: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000001c 26764 1726882725.36930: variable 'ansible_search_path' from source: unknown 26764 1726882725.36936: variable 'ansible_search_path' from source: unknown 26764 1726882725.36980: calling self._execute() 26764 1726882725.37053: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882725.37058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882725.37077: variable 'omit' from source: magic vars 26764 1726882725.37465: variable 'ansible_distribution_major_version' from source: facts 26764 1726882725.37480: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882725.37615: variable 'network_provider' from source: set_fact 26764 1726882725.37625: Evaluated conditional (network_provider == "nm"): True 26764 1726882725.37728: variable '__network_wpa_supplicant_required' from source: role '' defaults 26764 1726882725.37847: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26764 1726882725.38013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882725.40137: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882725.40208: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882725.40249: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882725.40292: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882725.40330: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882725.40572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882725.40594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882725.40616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882725.40642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882725.40653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882725.40691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882725.40713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882725.40727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882725.40752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882725.40763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882725.40794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882725.40824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882725.40848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882725.40889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882725.40912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 26764 1726882725.41050: variable 'network_connections' from source: role '' defaults 26764 1726882725.41091: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882725.41260: variable 'network_connections' from source: role '' defaults 26764 1726882725.41282: Evaluated conditional (__network_wpa_supplicant_required): False 26764 1726882725.41286: when evaluation is False, skipping this task 26764 1726882725.41288: _execute() done 26764 1726882725.41290: dumping result to json 26764 1726882725.41293: done dumping result, returning 26764 1726882725.41300: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-9875-c9a3-00000000001c] 26764 1726882725.41304: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001c 26764 1726882725.41409: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001c 26764 1726882725.41412: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 26764 1726882725.41452: no more pending results, returning what we have 26764 1726882725.41455: results queue empty 26764 1726882725.41457: checking for any_errors_fatal 26764 1726882725.41478: done checking for any_errors_fatal 26764 1726882725.41479: checking for max_fail_percentage 26764 1726882725.41481: done checking for max_fail_percentage 26764 1726882725.41482: checking to see if all hosts have failed and the running result is not ok 26764 1726882725.41483: done checking to see if all hosts have failed 26764 1726882725.41483: getting the remaining hosts for this loop 26764 1726882725.41485: done getting the remaining hosts for this loop 26764 1726882725.41488: getting the next task for host managed_node2 26764 1726882725.41495: done getting next task for host managed_node2 26764 1726882725.41499: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 26764 1726882725.41501: ^ state is: HOST STATE: block=3, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882725.41512: getting variables 26764 1726882725.41514: in VariableManager get_vars() 26764 1726882725.41544: Calling all_inventory to load vars for managed_node2 26764 1726882725.41550: Calling groups_inventory to load vars for managed_node2 26764 1726882725.41552: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882725.41560: Calling all_plugins_play to load vars for managed_node2 26764 1726882725.41562: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882725.41567: Calling groups_plugins_play to load vars for managed_node2 26764 1726882725.42483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882725.43419: done with get_vars() 26764 1726882725.43433: done getting variables 26764 1726882725.43478: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:45 -0400 (0:00:00.071) 0:00:11.376 ****** 26764 1726882725.43497: entering _queue_task() for managed_node2/service 26764 1726882725.43685: worker is 1 (out of 1 available) 26764 1726882725.43697: exiting _queue_task() for managed_node2/service 26764 1726882725.43708: done queuing things up, now waiting for results queue to drain 26764 1726882725.43709: waiting for pending results... 26764 1726882725.43874: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 26764 1726882725.43941: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000001d 26764 1726882725.43952: variable 'ansible_search_path' from source: unknown 26764 1726882725.43955: variable 'ansible_search_path' from source: unknown 26764 1726882725.44002: calling self._execute() 26764 1726882725.44067: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882725.44075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882725.44084: variable 'omit' from source: magic vars 26764 1726882725.44345: variable 'ansible_distribution_major_version' from source: facts 26764 1726882725.44355: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882725.44438: variable 'network_provider' from source: set_fact 26764 1726882725.44441: Evaluated conditional (network_provider == "initscripts"): False 26764 1726882725.44444: when evaluation is False, skipping this task 26764 1726882725.44446: _execute() done 26764 1726882725.44449: dumping result to json 26764 1726882725.44452: done dumping result, returning 26764 1726882725.44458: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-9875-c9a3-00000000001d] 26764 1726882725.44465: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001d 26764 1726882725.44548: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001d 26764 1726882725.44550: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 
1726882725.44612: no more pending results, returning what we have 26764 1726882725.44615: results queue empty 26764 1726882725.44616: checking for any_errors_fatal 26764 1726882725.44620: done checking for any_errors_fatal 26764 1726882725.44621: checking for max_fail_percentage 26764 1726882725.44622: done checking for max_fail_percentage 26764 1726882725.44623: checking to see if all hosts have failed and the running result is not ok 26764 1726882725.44624: done checking to see if all hosts have failed 26764 1726882725.44625: getting the remaining hosts for this loop 26764 1726882725.44626: done getting the remaining hosts for this loop 26764 1726882725.44628: getting the next task for host managed_node2 26764 1726882725.44633: done getting next task for host managed_node2 26764 1726882725.44637: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26764 1726882725.44639: ^ state is: HOST STATE: block=3, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882725.44651: getting variables 26764 1726882725.44653: in VariableManager get_vars() 26764 1726882725.44686: Calling all_inventory to load vars for managed_node2 26764 1726882725.44688: Calling groups_inventory to load vars for managed_node2 26764 1726882725.44689: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882725.44695: Calling all_plugins_play to load vars for managed_node2 26764 1726882725.44697: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882725.44699: Calling groups_plugins_play to load vars for managed_node2 26764 1726882725.45456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882725.46794: done with get_vars() 26764 1726882725.46807: done getting variables 26764 1726882725.46844: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:45 -0400 (0:00:00.033) 0:00:11.410 ****** 26764 1726882725.46863: entering _queue_task() for managed_node2/copy 26764 1726882725.47037: worker is 1 (out of 1 available) 26764 1726882725.47048: exiting _queue_task() for managed_node2/copy 26764 1726882725.47065: done queuing things up, now waiting for results queue to drain 26764 1726882725.47067: waiting for pending results... 
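The skip decisions in this stretch of the run come from per-task when: conditionals that the executor evaluates before dispatching the module; when a condition is false the task is reported as skipping with that condition recorded as false_condition. Sketched from the conditions the log prints (the conditional expressions match the logged evaluations; the module body and service name are illustrative placeholders, not taken from the role):

    # Illustrative only: conditions as evaluated in the log above,
    # module arguments are placeholders.
    - name: Enable network service
      ansible.builtin.service:
        name: network
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "initscripts"

Because network_provider is "nm" on this run, every initscripts-gated task (Enable network service, Ensure initscripts network file dependency is present) is skipped the same way.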
26764 1726882725.47332: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26764 1726882725.47387: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000001e 26764 1726882725.47411: variable 'ansible_search_path' from source: unknown 26764 1726882725.47414: variable 'ansible_search_path' from source: unknown 26764 1726882725.47497: calling self._execute() 26764 1726882725.47538: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882725.47542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882725.47577: variable 'omit' from source: magic vars 26764 1726882725.47878: variable 'ansible_distribution_major_version' from source: facts 26764 1726882725.47916: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882725.48018: variable 'network_provider' from source: set_fact 26764 1726882725.48022: Evaluated conditional (network_provider == "initscripts"): False 26764 1726882725.48025: when evaluation is False, skipping this task 26764 1726882725.48027: _execute() done 26764 1726882725.48030: dumping result to json 26764 1726882725.48032: done dumping result, returning 26764 1726882725.48043: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-9875-c9a3-00000000001e] 26764 1726882725.48046: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001e 26764 1726882725.48142: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001e 26764 1726882725.48145: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 26764 1726882725.48219: no more pending results, returning what we have 26764 1726882725.48222: results queue empty 26764 1726882725.48223: checking for any_errors_fatal 26764 1726882725.48227: done checking for any_errors_fatal 26764 1726882725.48228: checking for max_fail_percentage 26764 1726882725.48229: done checking for max_fail_percentage 26764 1726882725.48230: checking to see if all hosts have failed and the running result is not ok 26764 1726882725.48231: done checking to see if all hosts have failed 26764 1726882725.48231: getting the remaining hosts for this loop 26764 1726882725.48232: done getting the remaining hosts for this loop 26764 1726882725.48235: getting the next task for host managed_node2 26764 1726882725.48239: done getting next task for host managed_node2 26764 1726882725.48242: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26764 1726882725.48243: ^ state is: HOST STATE: block=3, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882725.48251: getting variables 26764 1726882725.48252: in VariableManager get_vars() 26764 1726882725.48284: Calling all_inventory to load vars for managed_node2 26764 1726882725.48286: Calling groups_inventory to load vars for managed_node2 26764 1726882725.48287: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882725.48293: Calling all_plugins_play to load vars for managed_node2 26764 1726882725.48295: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882725.48296: Calling groups_plugins_play to load vars for managed_node2 26764 1726882725.49425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882725.50455: done with get_vars() 26764 1726882725.50472: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:45 -0400 (0:00:00.036) 0:00:11.446 ****** 26764 1726882725.50523: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 26764 1726882725.50524: Creating lock for fedora.linux_system_roles.network_connections 26764 1726882725.50690: worker is 1 (out of 1 available) 26764 1726882725.50702: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 26764 1726882725.50714: done queuing things up, now waiting for results queue to drain 26764 1726882725.50715: waiting for pending results... 26764 1726882725.50884: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26764 1726882725.50953: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000001f 26764 1726882725.50965: variable 'ansible_search_path' from source: unknown 26764 1726882725.50969: variable 'ansible_search_path' from source: unknown 26764 1726882725.51000: calling self._execute() 26764 1726882725.51054: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882725.51058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882725.51071: variable 'omit' from source: magic vars 26764 1726882725.51310: variable 'ansible_distribution_major_version' from source: facts 26764 1726882725.51321: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882725.51326: variable 'omit' from source: magic vars 26764 1726882725.51351: variable 'omit' from source: magic vars 26764 1726882725.51458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882725.53174: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882725.53218: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882725.53245: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882725.53272: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882725.53294: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882725.53350: variable 'network_provider' from source: set_fact 26764 1726882725.53439: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882725.53471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882725.53491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882725.53518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882725.53530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882725.53583: variable 'omit' from source: magic vars 26764 1726882725.53658: variable 'omit' from source: magic vars 26764 1726882725.53748: variable 'network_connections' from source: role '' defaults 26764 1726882725.53845: variable 'omit' from source: magic vars 26764 1726882725.53857: variable '__lsr_ansible_managed' from source: task vars 26764 1726882725.53924: variable '__lsr_ansible_managed' from source: task vars 26764 1726882725.54080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 26764 1726882725.54250: Loaded config def from plugin (lookup/template) 26764 1726882725.54253: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 26764 1726882725.54277: File lookup term: get_ansible_managed.j2 26764 1726882725.54280: variable 'ansible_search_path' from source: unknown 26764 1726882725.54283: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 26764 1726882725.54313: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 26764 1726882725.54316: variable 'ansible_search_path' from source: unknown 26764 1726882725.58515: variable 'ansible_managed' from source: unknown 26764 1726882725.58614: variable 'omit' from source: magic vars 26764 1726882725.58646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882725.58674: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882725.58690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882725.58703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882725.58711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882725.58733: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882725.58736: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882725.58738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882725.58804: Set connection var ansible_shell_executable to /bin/sh 26764 1726882725.58807: Set connection var ansible_shell_type to sh 26764 1726882725.58815: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882725.58819: Set connection var ansible_timeout to 10 26764 1726882725.58824: Set connection var ansible_connection to ssh 26764 1726882725.58829: Set connection var ansible_pipelining to False 26764 1726882725.58843: variable 'ansible_shell_executable' from source: unknown 26764 1726882725.58847: variable 'ansible_connection' from source: unknown 26764 1726882725.58850: variable 'ansible_module_compression' from source: unknown 26764 1726882725.58852: variable 'ansible_shell_type' from source: unknown 26764 1726882725.58854: variable 'ansible_shell_executable' from source: unknown 26764 1726882725.58856: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882725.58859: variable 'ansible_pipelining' from source: unknown 26764 1726882725.58863: variable 'ansible_timeout' from source: unknown 26764 1726882725.58881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882725.58959: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882725.58968: variable 'omit' from source: magic vars 26764 1726882725.58982: starting attempt loop 26764 1726882725.58985: running the handler 26764 1726882725.58994: _low_level_execute_command(): starting 26764 1726882725.58997: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882725.59657: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882725.59665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882725.59685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882725.59698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882725.59745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.59820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882725.59823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.59826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882725.59830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882725.59834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882725.59956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882725.61607: stdout chunk (state=3): >>>/root <<< 26764 1726882725.61706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882725.61754: stderr chunk (state=3): >>><<< 26764 1726882725.61757: stdout chunk (state=3): >>><<< 26764 1726882725.61776: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882725.61785: _low_level_execute_command(): starting 26764 1726882725.61790: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203 `" && echo ansible-tmp-1726882725.6177545-27298-82921567220203="` echo /root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203 `" ) && sleep 0' 26764 1726882725.62217: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882725.62223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882725.62249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882725.62256: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882725.62281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.62292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882725.62298: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882725.62310: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882725.62320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882725.62325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.62378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882725.62390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882725.62395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882725.62508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882725.64391: stdout chunk (state=3): >>>ansible-tmp-1726882725.6177545-27298-82921567220203=/root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203 <<< 26764 1726882725.64506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882725.64558: stderr chunk (state=3): >>><<< 26764 1726882725.64576: stdout chunk (state=3): >>><<< 26764 1726882725.64593: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882725.6177545-27298-82921567220203=/root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882725.64625: variable 'ansible_module_compression' from source: unknown 26764 1726882725.64659: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 26764 1726882725.64662: ANSIBALLZ: Acquiring lock 26764 1726882725.64680: ANSIBALLZ: Lock acquired: 140693693914944 26764 1726882725.64683: ANSIBALLZ: Creating module 26764 1726882725.80162: ANSIBALLZ: Writing module into payload 26764 1726882725.80503: ANSIBALLZ: Writing module 26764 1726882725.80527: ANSIBALLZ: Renaming module 26764 1726882725.80532: ANSIBALLZ: Done creating module 26764 1726882725.80551: variable 'ansible_facts' from source: unknown 26764 1726882725.80617: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203/AnsiballZ_network_connections.py 26764 
1726882725.80724: Sending initial data 26764 1726882725.80729: Sent initial data (167 bytes) 26764 1726882725.81429: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882725.81435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882725.81468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.81490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882725.81495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.81543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882725.81554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882725.81672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882725.83507: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882725.83605: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882725.83703: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpf_3hy5m8 /root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203/AnsiballZ_network_connections.py <<< 26764 1726882725.83796: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882725.85173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882725.85268: stderr chunk (state=3): >>><<< 26764 1726882725.85274: stdout chunk (state=3): >>><<< 26764 1726882725.85293: done transferring module to remote 26764 1726882725.85302: _low_level_execute_command(): starting 26764 1726882725.85307: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203/ /root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203/AnsiballZ_network_connections.py && sleep 0' 26764 1726882725.85751: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882725.85754: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882725.85791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882725.85794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 26764 1726882725.85797: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882725.85799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.85849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882725.85853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882725.85960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882725.87727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882725.87780: stderr chunk (state=3): >>><<< 26764 1726882725.87783: stdout chunk (state=3): >>><<< 26764 1726882725.87796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882725.87799: _low_level_execute_command(): starting 26764 1726882725.87804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203/AnsiballZ_network_connections.py && sleep 0' 26764 1726882725.88229: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882725.88234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882725.88270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.88284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882725.88343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882725.88353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882725.88455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882726.09927: stdout chunk (state=3): >>> {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 26764 1726882726.11372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882726.11453: stderr chunk (state=3): >>><<< 26764 1726882726.11456: stdout chunk (state=3): >>><<< 26764 1726882726.11606: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 
closed. 26764 1726882726.11610: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882726.11613: _low_level_execute_command(): starting 26764 1726882726.11616: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882725.6177545-27298-82921567220203/ > /dev/null 2>&1 && sleep 0' 26764 1726882726.12096: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.12100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.12148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.12151: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.12154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.12220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882726.12222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882726.12224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882726.12317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882726.14109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882726.14152: stderr chunk (state=3): >>><<< 26764 1726882726.14155: stdout chunk (state=3): >>><<< 26764 1726882726.14171: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882726.14179: handler run complete 26764 1726882726.14199: attempt loop complete, returning result 26764 1726882726.14202: _execute() done 26764 1726882726.14204: dumping result to json 26764 1726882726.14208: done dumping result, returning 26764 1726882726.14216: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-9875-c9a3-00000000001f] 26764 1726882726.14221: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001f 26764 1726882726.14320: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000001f 26764 1726882726.14322: WORKER PROCESS EXITING ok: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false } STDERR: 26764 1726882726.14399: no more pending results, returning what we have 26764 1726882726.14403: results queue empty 26764 1726882726.14404: checking for any_errors_fatal 26764 1726882726.14414: done checking for any_errors_fatal 26764 1726882726.14415: checking for max_fail_percentage 26764 1726882726.14417: done checking for max_fail_percentage 26764 1726882726.14418: checking to see if all hosts have failed and the running result is not ok 26764 1726882726.14418: done checking to see if all hosts have failed 26764 1726882726.14419: getting the remaining hosts for this loop 26764 1726882726.14420: done getting the remaining hosts for this loop 26764 1726882726.14424: getting the next task for host managed_node2 26764 1726882726.14429: done getting next task for host managed_node2 26764 1726882726.14433: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 26764 1726882726.14435: ^ state is: HOST STATE: block=3, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882726.14444: getting variables 26764 1726882726.14446: in VariableManager get_vars() 26764 1726882726.14485: Calling all_inventory to load vars for managed_node2 26764 1726882726.14488: Calling groups_inventory to load vars for managed_node2 26764 1726882726.14491: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882726.14501: Calling all_plugins_play to load vars for managed_node2 26764 1726882726.14503: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882726.14506: Calling groups_plugins_play to load vars for managed_node2 26764 1726882726.15373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882726.16406: done with get_vars() 26764 1726882726.16421: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:46 -0400 (0:00:00.659) 0:00:12.106 ****** 26764 1726882726.16482: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 26764 1726882726.16483: Creating lock for fedora.linux_system_roles.network_state 26764 1726882726.16700: worker is 1 (out of 1 available) 26764 1726882726.16714: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 26764 1726882726.16726: done queuing things up, now waiting for results queue to drain 26764 1726882726.16727: waiting for pending results... 26764 1726882726.16892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 26764 1726882726.16953: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000020 26764 1726882726.16976: variable 'ansible_search_path' from source: unknown 26764 1726882726.16979: variable 'ansible_search_path' from source: unknown 26764 1726882726.17007: calling self._execute() 26764 1726882726.17081: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.17085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.17093: variable 'omit' from source: magic vars 26764 1726882726.17355: variable 'ansible_distribution_major_version' from source: facts 26764 1726882726.17366: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882726.17519: variable 'network_state' from source: role '' defaults 26764 1726882726.17549: Evaluated conditional (network_state != {}): False 26764 1726882726.17560: when evaluation is False, skipping this task 26764 1726882726.17580: _execute() done 26764 1726882726.17594: dumping result to json 26764 1726882726.17609: done dumping result, returning 26764 1726882726.17631: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-9875-c9a3-000000000020] 26764 1726882726.17646: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000020 26764 1726882726.17788: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000020 26764 1726882726.17805: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882726.17942: no more pending results, returning what we have 26764 1726882726.17946: results queue empty 26764 1726882726.17947: checking for any_errors_fatal 26764 1726882726.17956: done checking for 
any_errors_fatal 26764 1726882726.17957: checking for max_fail_percentage 26764 1726882726.17958: done checking for max_fail_percentage 26764 1726882726.17959: checking to see if all hosts have failed and the running result is not ok 26764 1726882726.17960: done checking to see if all hosts have failed 26764 1726882726.17960: getting the remaining hosts for this loop 26764 1726882726.17962: done getting the remaining hosts for this loop 26764 1726882726.17967: getting the next task for host managed_node2 26764 1726882726.17973: done getting next task for host managed_node2 26764 1726882726.17976: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 26764 1726882726.17978: ^ state is: HOST STATE: block=3, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882726.17994: getting variables 26764 1726882726.17996: in VariableManager get_vars() 26764 1726882726.18029: Calling all_inventory to load vars for managed_node2 26764 1726882726.18032: Calling groups_inventory to load vars for managed_node2 26764 1726882726.18034: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882726.18042: Calling all_plugins_play to load vars for managed_node2 26764 1726882726.18043: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882726.18045: Calling groups_plugins_play to load vars for managed_node2 26764 1726882726.19992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882726.21870: done with get_vars() 26764 1726882726.21903: done getting variables 26764 1726882726.21991: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:46 -0400 (0:00:00.055) 0:00:12.161 ****** 26764 1726882726.22034: entering _queue_task() for managed_node2/debug 26764 1726882726.22382: worker is 1 (out of 1 available) 26764 1726882726.22401: exiting _queue_task() for managed_node2/debug 26764 1726882726.22413: done queuing things up, now waiting for results queue to drain 26764 1726882726.22415: waiting for pending results... 
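The two role tasks traced in this stretch of the log are easier to follow with their approximate YAML in view. The sketch below is a hypothetical reconstruction built only from what the trace itself shows, that is, the task names, the conditionals that were evaluated (ansible_distribution_major_version != '6' and network_state != {}), and the registered variable __network_connections_result; the real tasks/main.yml in fedora.linux_system_roles.network may differ in structure and arguments.

# Hypothetical reconstruction from this trace, not the actual role source.
# "Configure networking state" is skipped above because network_state
# defaults to {} for this test, so the second conditional is False.
- name: Configure networking state
  fedora.linux_system_roles.network_state:   # module arguments are not visible in this trace
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}

# "Show stderr messages for the network_connections" is a plain debug task;
# the trace below prints __network_connections_result.stderr_lines as [""].
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
  when: ansible_distribution_major_version != '6'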
26764 1726882726.22704: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 26764 1726882726.22914: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000021 26764 1726882726.22956: variable 'ansible_search_path' from source: unknown 26764 1726882726.22970: variable 'ansible_search_path' from source: unknown 26764 1726882726.23011: calling self._execute() 26764 1726882726.23106: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.23110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.23121: variable 'omit' from source: magic vars 26764 1726882726.23384: variable 'ansible_distribution_major_version' from source: facts 26764 1726882726.23394: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882726.23400: variable 'omit' from source: magic vars 26764 1726882726.23428: variable 'omit' from source: magic vars 26764 1726882726.23453: variable 'omit' from source: magic vars 26764 1726882726.23493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882726.23525: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882726.23554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882726.23578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882726.23594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882726.23624: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882726.23631: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.23639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.23736: Set connection var ansible_shell_executable to /bin/sh 26764 1726882726.23744: Set connection var ansible_shell_type to sh 26764 1726882726.23758: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882726.23771: Set connection var ansible_timeout to 10 26764 1726882726.23779: Set connection var ansible_connection to ssh 26764 1726882726.23786: Set connection var ansible_pipelining to False 26764 1726882726.23809: variable 'ansible_shell_executable' from source: unknown 26764 1726882726.23815: variable 'ansible_connection' from source: unknown 26764 1726882726.23820: variable 'ansible_module_compression' from source: unknown 26764 1726882726.23825: variable 'ansible_shell_type' from source: unknown 26764 1726882726.23831: variable 'ansible_shell_executable' from source: unknown 26764 1726882726.23838: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.23849: variable 'ansible_pipelining' from source: unknown 26764 1726882726.23856: variable 'ansible_timeout' from source: unknown 26764 1726882726.23869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.24009: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 
1726882726.24024: variable 'omit' from source: magic vars 26764 1726882726.24032: starting attempt loop 26764 1726882726.24038: running the handler 26764 1726882726.24161: variable '__network_connections_result' from source: set_fact 26764 1726882726.24216: handler run complete 26764 1726882726.24236: attempt loop complete, returning result 26764 1726882726.24243: _execute() done 26764 1726882726.24248: dumping result to json 26764 1726882726.24255: done dumping result, returning 26764 1726882726.24269: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-9875-c9a3-000000000021] 26764 1726882726.24279: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000021 26764 1726882726.24374: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000021 26764 1726882726.24381: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 26764 1726882726.24451: no more pending results, returning what we have 26764 1726882726.24453: results queue empty 26764 1726882726.24454: checking for any_errors_fatal 26764 1726882726.24461: done checking for any_errors_fatal 26764 1726882726.24461: checking for max_fail_percentage 26764 1726882726.24465: done checking for max_fail_percentage 26764 1726882726.24466: checking to see if all hosts have failed and the running result is not ok 26764 1726882726.24467: done checking to see if all hosts have failed 26764 1726882726.24468: getting the remaining hosts for this loop 26764 1726882726.24469: done getting the remaining hosts for this loop 26764 1726882726.24472: getting the next task for host managed_node2 26764 1726882726.24479: done getting next task for host managed_node2 26764 1726882726.24482: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 26764 1726882726.24484: ^ state is: HOST STATE: block=3, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882726.24492: getting variables 26764 1726882726.24494: in VariableManager get_vars() 26764 1726882726.24535: Calling all_inventory to load vars for managed_node2 26764 1726882726.24538: Calling groups_inventory to load vars for managed_node2 26764 1726882726.24540: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882726.24549: Calling all_plugins_play to load vars for managed_node2 26764 1726882726.24552: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882726.24554: Calling groups_plugins_play to load vars for managed_node2 26764 1726882726.26628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882726.28465: done with get_vars() 26764 1726882726.28491: done getting variables 26764 1726882726.28567: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:46 -0400 (0:00:00.065) 0:00:12.227 ****** 26764 1726882726.28597: entering _queue_task() for managed_node2/debug 26764 1726882726.28885: worker is 1 (out of 1 available) 26764 1726882726.28899: exiting _queue_task() for managed_node2/debug 26764 1726882726.28912: done queuing things up, now waiting for results queue to drain 26764 1726882726.28913: waiting for pending results... 26764 1726882726.29091: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 26764 1726882726.29159: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000022 26764 1726882726.29174: variable 'ansible_search_path' from source: unknown 26764 1726882726.29178: variable 'ansible_search_path' from source: unknown 26764 1726882726.29211: calling self._execute() 26764 1726882726.29287: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.29296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.29306: variable 'omit' from source: magic vars 26764 1726882726.29570: variable 'ansible_distribution_major_version' from source: facts 26764 1726882726.29579: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882726.29585: variable 'omit' from source: magic vars 26764 1726882726.29615: variable 'omit' from source: magic vars 26764 1726882726.29639: variable 'omit' from source: magic vars 26764 1726882726.29673: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882726.29700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882726.29716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882726.29732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882726.29741: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882726.29767: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882726.29770: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.29772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.29840: Set connection var ansible_shell_executable to /bin/sh 26764 1726882726.29843: Set connection var ansible_shell_type to sh 26764 1726882726.29851: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882726.29856: Set connection var ansible_timeout to 10 26764 1726882726.29861: Set connection var ansible_connection to ssh 26764 1726882726.29869: Set connection var ansible_pipelining to False 26764 1726882726.29885: variable 'ansible_shell_executable' from source: unknown 26764 1726882726.29888: variable 'ansible_connection' from source: unknown 26764 1726882726.29891: variable 'ansible_module_compression' from source: unknown 26764 1726882726.29893: variable 'ansible_shell_type' from source: unknown 26764 1726882726.29897: variable 'ansible_shell_executable' from source: unknown 26764 1726882726.29899: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.29901: variable 'ansible_pipelining' from source: unknown 26764 1726882726.29903: variable 'ansible_timeout' from source: unknown 26764 1726882726.29909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.30008: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882726.30019: variable 'omit' from source: magic vars 26764 1726882726.30024: starting attempt loop 26764 1726882726.30027: running the handler 26764 1726882726.30069: variable '__network_connections_result' from source: set_fact 26764 1726882726.30118: variable '__network_connections_result' from source: set_fact 26764 1726882726.30187: handler run complete 26764 1726882726.30202: attempt loop complete, returning result 26764 1726882726.30205: _execute() done 26764 1726882726.30208: dumping result to json 26764 1726882726.30212: done dumping result, returning 26764 1726882726.30220: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-9875-c9a3-000000000022] 26764 1726882726.30224: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000022 26764 1726882726.30318: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000022 26764 1726882726.30320: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": false, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 26764 1726882726.30415: no more pending results, returning what we have 26764 1726882726.30418: results queue empty 26764 1726882726.30419: checking for any_errors_fatal 26764 1726882726.30424: done checking for any_errors_fatal 26764 1726882726.30425: checking for max_fail_percentage 26764 1726882726.30426: done checking for max_fail_percentage 26764 1726882726.30427: checking to see if all hosts have failed and the running result is not 
ok 26764 1726882726.30428: done checking to see if all hosts have failed 26764 1726882726.30429: getting the remaining hosts for this loop 26764 1726882726.30430: done getting the remaining hosts for this loop 26764 1726882726.30432: getting the next task for host managed_node2 26764 1726882726.30437: done getting next task for host managed_node2 26764 1726882726.30440: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 26764 1726882726.30442: ^ state is: HOST STATE: block=3, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882726.30450: getting variables 26764 1726882726.30451: in VariableManager get_vars() 26764 1726882726.30490: Calling all_inventory to load vars for managed_node2 26764 1726882726.30493: Calling groups_inventory to load vars for managed_node2 26764 1726882726.30494: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882726.30501: Calling all_plugins_play to load vars for managed_node2 26764 1726882726.30503: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882726.30504: Calling groups_plugins_play to load vars for managed_node2 26764 1726882726.31280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882726.32956: done with get_vars() 26764 1726882726.32980: done getting variables 26764 1726882726.33034: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:46 -0400 (0:00:00.044) 0:00:12.272 ****** 26764 1726882726.33061: entering _queue_task() for managed_node2/debug 26764 1726882726.33318: worker is 1 (out of 1 available) 26764 1726882726.33330: exiting _queue_task() for managed_node2/debug 26764 1726882726.33341: done queuing things up, now waiting for results queue to drain 26764 1726882726.33342: waiting for pending results... 
26764 1726882726.33600: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 26764 1726882726.33680: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000023 26764 1726882726.33694: variable 'ansible_search_path' from source: unknown 26764 1726882726.33697: variable 'ansible_search_path' from source: unknown 26764 1726882726.33729: calling self._execute() 26764 1726882726.33816: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.33820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.33831: variable 'omit' from source: magic vars 26764 1726882726.34189: variable 'ansible_distribution_major_version' from source: facts 26764 1726882726.34200: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882726.34322: variable 'network_state' from source: role '' defaults 26764 1726882726.34333: Evaluated conditional (network_state != {}): False 26764 1726882726.34336: when evaluation is False, skipping this task 26764 1726882726.34338: _execute() done 26764 1726882726.34341: dumping result to json 26764 1726882726.34344: done dumping result, returning 26764 1726882726.34353: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-9875-c9a3-000000000023] 26764 1726882726.34358: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000023 26764 1726882726.34453: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000023 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 26764 1726882726.34500: no more pending results, returning what we have 26764 1726882726.34504: results queue empty 26764 1726882726.34505: checking for any_errors_fatal 26764 1726882726.34514: done checking for any_errors_fatal 26764 1726882726.34515: checking for max_fail_percentage 26764 1726882726.34516: done checking for max_fail_percentage 26764 1726882726.34518: checking to see if all hosts have failed and the running result is not ok 26764 1726882726.34518: done checking to see if all hosts have failed 26764 1726882726.34519: getting the remaining hosts for this loop 26764 1726882726.34520: done getting the remaining hosts for this loop 26764 1726882726.34524: getting the next task for host managed_node2 26764 1726882726.34531: done getting next task for host managed_node2 26764 1726882726.34535: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 26764 1726882726.34537: ^ state is: HOST STATE: block=3, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882726.34555: WORKER PROCESS EXITING 26764 1726882726.34560: getting variables 26764 1726882726.34562: in VariableManager get_vars() 26764 1726882726.34599: Calling all_inventory to load vars for managed_node2 26764 1726882726.34602: Calling groups_inventory to load vars for managed_node2 26764 1726882726.34605: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882726.34617: Calling all_plugins_play to load vars for managed_node2 26764 1726882726.34620: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882726.34623: Calling groups_plugins_play to load vars for managed_node2 26764 1726882726.36271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882726.37946: done with get_vars() 26764 1726882726.37968: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:46 -0400 (0:00:00.049) 0:00:12.322 ****** 26764 1726882726.38051: entering _queue_task() for managed_node2/ping 26764 1726882726.38053: Creating lock for ping 26764 1726882726.38305: worker is 1 (out of 1 available) 26764 1726882726.38317: exiting _queue_task() for managed_node2/ping 26764 1726882726.38328: done queuing things up, now waiting for results queue to drain 26764 1726882726.38329: waiting for pending results... 26764 1726882726.38592: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 26764 1726882726.38671: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000024 26764 1726882726.38683: variable 'ansible_search_path' from source: unknown 26764 1726882726.38687: variable 'ansible_search_path' from source: unknown 26764 1726882726.38719: calling self._execute() 26764 1726882726.38806: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.38809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.38821: variable 'omit' from source: magic vars 26764 1726882726.39175: variable 'ansible_distribution_major_version' from source: facts 26764 1726882726.39186: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882726.39193: variable 'omit' from source: magic vars 26764 1726882726.39235: variable 'omit' from source: magic vars 26764 1726882726.39269: variable 'omit' from source: magic vars 26764 1726882726.39305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882726.39342: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882726.39362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882726.39381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882726.39393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882726.39425: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882726.39428: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.39432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.39535: Set 
connection var ansible_shell_executable to /bin/sh 26764 1726882726.39538: Set connection var ansible_shell_type to sh 26764 1726882726.39548: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882726.39553: Set connection var ansible_timeout to 10 26764 1726882726.39559: Set connection var ansible_connection to ssh 26764 1726882726.39567: Set connection var ansible_pipelining to False 26764 1726882726.39587: variable 'ansible_shell_executable' from source: unknown 26764 1726882726.39590: variable 'ansible_connection' from source: unknown 26764 1726882726.39593: variable 'ansible_module_compression' from source: unknown 26764 1726882726.39595: variable 'ansible_shell_type' from source: unknown 26764 1726882726.39597: variable 'ansible_shell_executable' from source: unknown 26764 1726882726.39599: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882726.39608: variable 'ansible_pipelining' from source: unknown 26764 1726882726.39610: variable 'ansible_timeout' from source: unknown 26764 1726882726.39612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882726.39809: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882726.39818: variable 'omit' from source: magic vars 26764 1726882726.39822: starting attempt loop 26764 1726882726.39825: running the handler 26764 1726882726.39838: _low_level_execute_command(): starting 26764 1726882726.39845: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882726.40577: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882726.40590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.40599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.40615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.40657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.40669: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882726.40677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.40691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882726.40699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882726.40708: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882726.40714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.40723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.40738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.40745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.40752: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882726.40762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 
1726882726.40835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882726.40858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882726.40875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882726.41002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882726.42651: stdout chunk (state=3): >>>/root <<< 26764 1726882726.42778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882726.42823: stderr chunk (state=3): >>><<< 26764 1726882726.42826: stdout chunk (state=3): >>><<< 26764 1726882726.42849: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882726.42861: _low_level_execute_command(): starting 26764 1726882726.42870: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942 `" && echo ansible-tmp-1726882726.4284756-27318-66155883878942="` echo /root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942 `" ) && sleep 0' 26764 1726882726.43472: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882726.43482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.43492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.43505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.43543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.43550: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882726.43560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.43576: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882726.43583: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882726.43590: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882726.43597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 
1726882726.43606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.43618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.43624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.43631: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882726.43639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.43712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882726.43725: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882726.43737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882726.43870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882726.45778: stdout chunk (state=3): >>>ansible-tmp-1726882726.4284756-27318-66155883878942=/root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942 <<< 26764 1726882726.45951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882726.45954: stdout chunk (state=3): >>><<< 26764 1726882726.45962: stderr chunk (state=3): >>><<< 26764 1726882726.45982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882726.4284756-27318-66155883878942=/root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882726.46027: variable 'ansible_module_compression' from source: unknown 26764 1726882726.46068: ANSIBALLZ: Using lock for ping 26764 1726882726.46071: ANSIBALLZ: Acquiring lock 26764 1726882726.46074: ANSIBALLZ: Lock acquired: 140693692586976 26764 1726882726.46076: ANSIBALLZ: Creating module 26764 1726882726.58929: ANSIBALLZ: Writing module into payload 26764 1726882726.58992: ANSIBALLZ: Writing module 26764 1726882726.59011: ANSIBALLZ: Renaming module 26764 1726882726.59017: ANSIBALLZ: Done creating module 26764 1726882726.59033: variable 'ansible_facts' from source: unknown 26764 1726882726.59101: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942/AnsiballZ_ping.py 26764 1726882726.59237: Sending initial data 26764 1726882726.59240: Sent initial data (152 bytes) 26764 
1726882726.60183: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882726.60194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.60204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.60219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.60256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.60267: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882726.60276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.60292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882726.60298: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882726.60305: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882726.60312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.60322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.60333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.60341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.60347: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882726.60358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.60430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882726.60445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882726.60448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882726.61197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882726.63053: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882726.63150: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882726.63253: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmp_fj5aayz /root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942/AnsiballZ_ping.py <<< 26764 1726882726.63350: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882726.64880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882726.64954: stderr chunk (state=3): >>><<< 26764 1726882726.64957: stdout chunk (state=3): >>><<< 26764 1726882726.64979: 
done transferring module to remote 26764 1726882726.64991: _low_level_execute_command(): starting 26764 1726882726.64995: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942/ /root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942/AnsiballZ_ping.py && sleep 0' 26764 1726882726.66306: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882726.66979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.66990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.67004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.67043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.67051: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882726.67061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.67080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882726.67088: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882726.67094: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882726.67103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.67111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.67123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.67130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.67137: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882726.67147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.67219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882726.67233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882726.67244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882726.67371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882726.69273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882726.69276: stdout chunk (state=3): >>><<< 26764 1726882726.69278: stderr chunk (state=3): >>><<< 26764 1726882726.69298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882726.69303: _low_level_execute_command(): starting 26764 1726882726.69306: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942/AnsiballZ_ping.py && sleep 0' 26764 1726882726.70374: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.70379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.70402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.70440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.70445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.70460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.70470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.70552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882726.70560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882726.70576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882726.70708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882726.83704: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 26764 1726882726.84781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 26764 1726882726.84785: stderr chunk (state=3): >>><<< 26764 1726882726.84787: stdout chunk (state=3): >>><<< 26764 1726882726.84807: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 26764 1726882726.84830: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882726.84838: _low_level_execute_command(): starting 26764 1726882726.84845: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882726.4284756-27318-66155883878942/ > /dev/null 2>&1 && sleep 0' 26764 1726882726.86449: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882726.86469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.86485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.86504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.86568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.86582: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882726.86597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.86614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882726.86626: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882726.86637: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882726.86656: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882726.86679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882726.86695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882726.86706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882726.86717: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882726.86730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882726.86816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882726.86842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882726.86859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882726.87003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882726.88899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882726.88902: stdout chunk (state=3): >>><<< 26764 1726882726.88904: stderr chunk (state=3): >>><<< 26764 1726882726.89178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882726.89181: handler run complete 26764 1726882726.89184: attempt loop complete, returning result 26764 1726882726.89185: _execute() done 26764 1726882726.89187: dumping result to json 26764 1726882726.89189: done dumping result, returning 26764 1726882726.89191: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-9875-c9a3-000000000024] 26764 1726882726.89193: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000024 26764 1726882726.89259: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000024 26764 1726882726.89262: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 26764 1726882726.89336: no more pending results, returning what we have 26764 1726882726.89340: results queue empty 26764 1726882726.89341: checking for any_errors_fatal 26764 1726882726.89347: done checking for any_errors_fatal 26764 1726882726.89348: checking for max_fail_percentage 26764 1726882726.89350: done checking for max_fail_percentage 26764 
1726882726.89351: checking to see if all hosts have failed and the running result is not ok 26764 1726882726.89351: done checking to see if all hosts have failed 26764 1726882726.89352: getting the remaining hosts for this loop 26764 1726882726.89354: done getting the remaining hosts for this loop 26764 1726882726.89357: getting the next task for host managed_node2 26764 1726882726.89370: done getting next task for host managed_node2 26764 1726882726.89373: ^ task is: TASK: meta (role_complete) 26764 1726882726.89375: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882726.89386: getting variables 26764 1726882726.89388: in VariableManager get_vars() 26764 1726882726.89428: Calling all_inventory to load vars for managed_node2 26764 1726882726.89431: Calling groups_inventory to load vars for managed_node2 26764 1726882726.89433: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882726.89444: Calling all_plugins_play to load vars for managed_node2 26764 1726882726.89447: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882726.89450: Calling groups_plugins_play to load vars for managed_node2 26764 1726882726.92237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882726.95922: done with get_vars() 26764 1726882726.95947: done getting variables 26764 1726882726.96141: done queuing things up, now waiting for results queue to drain 26764 1726882726.96144: results queue empty 26764 1726882726.96145: checking for any_errors_fatal 26764 1726882726.96147: done checking for any_errors_fatal 26764 1726882726.96148: checking for max_fail_percentage 26764 1726882726.96149: done checking for max_fail_percentage 26764 1726882726.96150: checking to see if all hosts have failed and the running result is not ok 26764 1726882726.96151: done checking to see if all hosts have failed 26764 1726882726.96152: getting the remaining hosts for this loop 26764 1726882726.96153: done getting the remaining hosts for this loop 26764 1726882726.96155: getting the next task for host managed_node2 26764 1726882726.96159: done getting next task for host managed_node2 26764 1726882726.96162: ^ task is: TASK: Include network role 26764 1726882726.96165: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882726.96167: getting variables 26764 1726882726.96168: in VariableManager get_vars() 26764 1726882726.96181: Calling all_inventory to load vars for managed_node2 26764 1726882726.96183: Calling groups_inventory to load vars for managed_node2 26764 1726882726.96185: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882726.96275: Calling all_plugins_play to load vars for managed_node2 26764 1726882726.96278: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882726.96281: Calling groups_plugins_play to load vars for managed_node2 26764 1726882726.98816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882727.02135: done with get_vars() 26764 1726882727.02279: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:27 Friday 20 September 2024 21:38:47 -0400 (0:00:00.642) 0:00:12.965 ****** 26764 1726882727.02354: entering _queue_task() for managed_node2/include_role 26764 1726882727.02357: Creating lock for include_role 26764 1726882727.03157: worker is 1 (out of 1 available) 26764 1726882727.03177: exiting _queue_task() for managed_node2/include_role 26764 1726882727.03189: done queuing things up, now waiting for results queue to drain 26764 1726882727.03190: waiting for pending results... 26764 1726882727.05021: running TaskExecutor() for managed_node2/TASK: Include network role 26764 1726882727.05248: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000002a 26764 1726882727.05331: variable 'ansible_search_path' from source: unknown 26764 1726882727.05373: calling self._execute() 26764 1726882727.05508: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882727.05648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882727.05668: variable 'omit' from source: magic vars 26764 1726882727.06378: variable 'ansible_distribution_major_version' from source: facts 26764 1726882727.06400: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882727.06411: _execute() done 26764 1726882727.06514: dumping result to json 26764 1726882727.06523: done dumping result, returning 26764 1726882727.06533: done running TaskExecutor() for managed_node2/TASK: Include network role [0e448fcc-3ce9-9875-c9a3-00000000002a] 26764 1726882727.06542: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002a 26764 1726882727.06701: no more pending results, returning what we have 26764 1726882727.06706: in VariableManager get_vars() 26764 1726882727.06748: Calling all_inventory to load vars for managed_node2 26764 1726882727.06751: Calling groups_inventory to load vars for managed_node2 26764 1726882727.06753: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882727.06771: Calling all_plugins_play to load vars for managed_node2 26764 1726882727.06774: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882727.06778: Calling groups_plugins_play to load vars for managed_node2 26764 1726882727.07297: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002a 26764 1726882727.07300: WORKER PROCESS EXITING 26764 1726882727.09384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882727.12907: done with get_vars() 26764 
1726882727.13049: variable 'ansible_search_path' from source: unknown 26764 1726882727.13542: variable 'omit' from source: magic vars 26764 1726882727.13677: variable 'omit' from source: magic vars 26764 1726882727.13695: variable 'omit' from source: magic vars 26764 1726882727.13700: we have included files to process 26764 1726882727.13701: generating all_blocks data 26764 1726882727.13703: done generating all_blocks data 26764 1726882727.13707: processing included file: fedora.linux_system_roles.network 26764 1726882727.13731: in VariableManager get_vars() 26764 1726882727.13750: done with get_vars() 26764 1726882727.13902: in VariableManager get_vars() 26764 1726882727.13922: done with get_vars() 26764 1726882727.13972: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 26764 1726882727.14323: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 26764 1726882727.14409: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 26764 1726882727.15524: in VariableManager get_vars() 26764 1726882727.15547: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 26764 1726882727.19397: iterating over new_blocks loaded from include file 26764 1726882727.19399: in VariableManager get_vars() 26764 1726882727.19418: done with get_vars() 26764 1726882727.19420: filtering new block on tags 26764 1726882727.19486: done filtering new block on tags 26764 1726882727.19490: in VariableManager get_vars() 26764 1726882727.19512: done with get_vars() 26764 1726882727.19515: filtering new block on tags 26764 1726882727.19532: done filtering new block on tags 26764 1726882727.19534: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node2 26764 1726882727.19540: extending task lists for all hosts with included blocks 26764 1726882727.19778: done extending task lists 26764 1726882727.19779: done processing included files 26764 1726882727.19780: results queue empty 26764 1726882727.19781: checking for any_errors_fatal 26764 1726882727.19782: done checking for any_errors_fatal 26764 1726882727.19783: checking for max_fail_percentage 26764 1726882727.19784: done checking for max_fail_percentage 26764 1726882727.19785: checking to see if all hosts have failed and the running result is not ok 26764 1726882727.19786: done checking to see if all hosts have failed 26764 1726882727.19787: getting the remaining hosts for this loop 26764 1726882727.19788: done getting the remaining hosts for this loop 26764 1726882727.19791: getting the next task for host managed_node2 26764 1726882727.19799: done getting next task for host managed_node2 26764 1726882727.19802: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 26764 1726882727.19804: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882727.19813: getting variables 26764 1726882727.19814: in VariableManager get_vars() 26764 1726882727.19833: Calling all_inventory to load vars for managed_node2 26764 1726882727.19835: Calling groups_inventory to load vars for managed_node2 26764 1726882727.19837: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882727.19843: Calling all_plugins_play to load vars for managed_node2 26764 1726882727.19845: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882727.19848: Calling groups_plugins_play to load vars for managed_node2 26764 1726882727.21275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882727.23057: done with get_vars() 26764 1726882727.23081: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:38:47 -0400 (0:00:00.207) 0:00:13.173 ****** 26764 1726882727.23156: entering _queue_task() for managed_node2/include_tasks 26764 1726882727.23501: worker is 1 (out of 1 available) 26764 1726882727.23512: exiting _queue_task() for managed_node2/include_tasks 26764 1726882727.23527: done queuing things up, now waiting for results queue to drain 26764 1726882727.23528: waiting for pending results... 26764 1726882727.23817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 26764 1726882727.23967: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001df 26764 1726882727.23989: variable 'ansible_search_path' from source: unknown 26764 1726882727.23996: variable 'ansible_search_path' from source: unknown 26764 1726882727.24030: calling self._execute() 26764 1726882727.24128: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882727.24140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882727.24161: variable 'omit' from source: magic vars 26764 1726882727.24551: variable 'ansible_distribution_major_version' from source: facts 26764 1726882727.24573: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882727.24583: _execute() done 26764 1726882727.24594: dumping result to json 26764 1726882727.24603: done dumping result, returning 26764 1726882727.24614: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-9875-c9a3-0000000001df] 26764 1726882727.24630: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001df 26764 1726882727.24779: no more pending results, returning what we have 26764 1726882727.24786: in VariableManager get_vars() 26764 1726882727.24831: Calling all_inventory to load vars for managed_node2 26764 1726882727.24834: Calling groups_inventory to load vars for managed_node2 26764 1726882727.24839: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882727.24853: Calling all_plugins_play to load vars for managed_node2 26764 1726882727.24857: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882727.24860: Calling groups_plugins_play to load vars for managed_node2 26764 1726882727.26637: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001df 26764 1726882727.26641: WORKER PROCESS EXITING 26764 1726882727.28075: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882727.29837: done with get_vars() 26764 1726882727.29892: variable 'ansible_search_path' from source: unknown 26764 1726882727.29893: variable 'ansible_search_path' from source: unknown 26764 1726882727.29938: we have included files to process 26764 1726882727.29939: generating all_blocks data 26764 1726882727.29941: done generating all_blocks data 26764 1726882727.29944: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26764 1726882727.29945: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26764 1726882727.29948: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 26764 1726882727.32428: done processing included file 26764 1726882727.32433: iterating over new_blocks loaded from include file 26764 1726882727.32435: in VariableManager get_vars() 26764 1726882727.32460: done with get_vars() 26764 1726882727.32462: filtering new block on tags 26764 1726882727.32489: done filtering new block on tags 26764 1726882727.32492: in VariableManager get_vars() 26764 1726882727.32516: done with get_vars() 26764 1726882727.32518: filtering new block on tags 26764 1726882727.32538: done filtering new block on tags 26764 1726882727.32541: in VariableManager get_vars() 26764 1726882727.32661: done with get_vars() 26764 1726882727.32667: filtering new block on tags 26764 1726882727.32687: done filtering new block on tags 26764 1726882727.32690: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 26764 1726882727.32695: extending task lists for all hosts with included blocks 26764 1726882727.33847: done extending task lists 26764 1726882727.33848: done processing included files 26764 1726882727.33849: results queue empty 26764 1726882727.33850: checking for any_errors_fatal 26764 1726882727.33853: done checking for any_errors_fatal 26764 1726882727.33854: checking for max_fail_percentage 26764 1726882727.33855: done checking for max_fail_percentage 26764 1726882727.33855: checking to see if all hosts have failed and the running result is not ok 26764 1726882727.33856: done checking to see if all hosts have failed 26764 1726882727.33857: getting the remaining hosts for this loop 26764 1726882727.33858: done getting the remaining hosts for this loop 26764 1726882727.33860: getting the next task for host managed_node2 26764 1726882727.33867: done getting next task for host managed_node2 26764 1726882727.33870: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 26764 1726882727.33874: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882727.33883: getting variables 26764 1726882727.33884: in VariableManager get_vars() 26764 1726882727.33898: Calling all_inventory to load vars for managed_node2 26764 1726882727.33900: Calling groups_inventory to load vars for managed_node2 26764 1726882727.33902: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882727.33907: Calling all_plugins_play to load vars for managed_node2 26764 1726882727.33910: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882727.33912: Calling groups_plugins_play to load vars for managed_node2 26764 1726882727.35320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882727.37186: done with get_vars() 26764 1726882727.37214: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:38:47 -0400 (0:00:00.141) 0:00:13.314 ****** 26764 1726882727.37285: entering _queue_task() for managed_node2/setup 26764 1726882727.37877: worker is 1 (out of 1 available) 26764 1726882727.37889: exiting _queue_task() for managed_node2/setup 26764 1726882727.37899: done queuing things up, now waiting for results queue to drain 26764 1726882727.37900: waiting for pending results... 26764 1726882727.38186: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 26764 1726882727.38342: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000243 26764 1726882727.38359: variable 'ansible_search_path' from source: unknown 26764 1726882727.38363: variable 'ansible_search_path' from source: unknown 26764 1726882727.38399: calling self._execute() 26764 1726882727.38492: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882727.38496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882727.38511: variable 'omit' from source: magic vars 26764 1726882727.38884: variable 'ansible_distribution_major_version' from source: facts 26764 1726882727.38899: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882727.39124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882727.41584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882727.41665: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882727.41704: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882727.41741: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882727.41772: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882727.41849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
26764 1726882727.41885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882727.41910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882727.41955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882727.41972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882727.42029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882727.42048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882727.42081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882727.42125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882727.42139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882727.42299: variable '__network_required_facts' from source: role '' defaults 26764 1726882727.42311: variable 'ansible_facts' from source: unknown 26764 1726882727.43106: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 26764 1726882727.43110: when evaluation is False, skipping this task 26764 1726882727.43112: _execute() done 26764 1726882727.43115: dumping result to json 26764 1726882727.43117: done dumping result, returning 26764 1726882727.43125: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-9875-c9a3-000000000243] 26764 1726882727.43132: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000243 26764 1726882727.43223: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000243 26764 1726882727.43226: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 1726882727.43291: no more pending results, returning what we have 26764 1726882727.43295: results queue empty 26764 1726882727.43296: checking for any_errors_fatal 26764 1726882727.43298: done checking for any_errors_fatal 26764 1726882727.43299: checking for max_fail_percentage 26764 1726882727.43301: done checking for max_fail_percentage 26764 1726882727.43302: checking to see if all hosts have failed and the running 
result is not ok 26764 1726882727.43303: done checking to see if all hosts have failed 26764 1726882727.43303: getting the remaining hosts for this loop 26764 1726882727.43305: done getting the remaining hosts for this loop 26764 1726882727.43308: getting the next task for host managed_node2 26764 1726882727.43319: done getting next task for host managed_node2 26764 1726882727.43324: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 26764 1726882727.43329: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882727.43344: getting variables 26764 1726882727.43346: in VariableManager get_vars() 26764 1726882727.43390: Calling all_inventory to load vars for managed_node2 26764 1726882727.43393: Calling groups_inventory to load vars for managed_node2 26764 1726882727.43396: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882727.43407: Calling all_plugins_play to load vars for managed_node2 26764 1726882727.43411: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882727.43414: Calling groups_plugins_play to load vars for managed_node2 26764 1726882727.45187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882727.50543: done with get_vars() 26764 1726882727.50569: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:38:47 -0400 (0:00:00.133) 0:00:13.448 ****** 26764 1726882727.50657: entering _queue_task() for managed_node2/stat 26764 1726882727.50982: worker is 1 (out of 1 available) 26764 1726882727.50994: exiting _queue_task() for managed_node2/stat 26764 1726882727.51007: done queuing things up, now waiting for results queue to drain 26764 1726882727.51008: waiting for pending results... 
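
The skip recorded just above comes from roles/network/tasks/set_facts.yml:3, where the role re-gathers facts only if something it needs is missing. The log confirms the action (setup), the gating expression, and that no_log was set (the "censored" skip result); the task body itself is not in this log, so the following is only a minimal sketch consistent with it, and the gather_subset value is an assumption.

    # Approximate reconstruction of set_facts.yml:3 based on this log only.
    # The when: expression and no_log: true are taken from the log output;
    # gather_subset is an assumed illustration, not the role's actual value.
    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true

Because every fact listed in __network_required_facts is already present from the earlier fact gathering, the difference is empty and the conditional evaluates False, which is exactly the "Evaluated conditional (...): False" and skip shown above.
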
26764 1726882727.51293: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 26764 1726882727.51431: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000245 26764 1726882727.51444: variable 'ansible_search_path' from source: unknown 26764 1726882727.51451: variable 'ansible_search_path' from source: unknown 26764 1726882727.51489: calling self._execute() 26764 1726882727.51583: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882727.51590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882727.51600: variable 'omit' from source: magic vars 26764 1726882727.51981: variable 'ansible_distribution_major_version' from source: facts 26764 1726882727.51994: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882727.52159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882727.52432: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882727.52478: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882727.52534: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882727.52572: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882727.52656: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882727.52690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882727.52722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882727.52746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882727.52836: variable '__network_is_ostree' from source: set_fact 26764 1726882727.52842: Evaluated conditional (not __network_is_ostree is defined): False 26764 1726882727.52845: when evaluation is False, skipping this task 26764 1726882727.52848: _execute() done 26764 1726882727.52850: dumping result to json 26764 1726882727.52852: done dumping result, returning 26764 1726882727.52860: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-9875-c9a3-000000000245] 26764 1726882727.52870: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000245 26764 1726882727.52956: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000245 26764 1726882727.52959: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 26764 1726882727.53034: no more pending results, returning what we have 26764 1726882727.53038: results queue empty 26764 1726882727.53040: checking for any_errors_fatal 26764 1726882727.53047: done checking for any_errors_fatal 26764 1726882727.53048: checking for 
max_fail_percentage 26764 1726882727.53050: done checking for max_fail_percentage 26764 1726882727.53051: checking to see if all hosts have failed and the running result is not ok 26764 1726882727.53052: done checking to see if all hosts have failed 26764 1726882727.53053: getting the remaining hosts for this loop 26764 1726882727.53054: done getting the remaining hosts for this loop 26764 1726882727.53058: getting the next task for host managed_node2 26764 1726882727.53067: done getting next task for host managed_node2 26764 1726882727.53071: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 26764 1726882727.53075: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882727.53090: getting variables 26764 1726882727.53092: in VariableManager get_vars() 26764 1726882727.53134: Calling all_inventory to load vars for managed_node2 26764 1726882727.53136: Calling groups_inventory to load vars for managed_node2 26764 1726882727.53138: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882727.53150: Calling all_plugins_play to load vars for managed_node2 26764 1726882727.53153: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882727.53157: Calling groups_plugins_play to load vars for managed_node2 26764 1726882727.54709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882727.56473: done with get_vars() 26764 1726882727.56493: done getting variables 26764 1726882727.56550: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:38:47 -0400 (0:00:00.059) 0:00:13.507 ****** 26764 1726882727.56586: entering _queue_task() for managed_node2/set_fact 26764 1726882727.56875: worker is 1 (out of 1 available) 26764 1726882727.56887: exiting _queue_task() for managed_node2/set_fact 26764 1726882727.56898: done queuing things up, now waiting for results queue to drain 26764 1726882727.56899: waiting for pending results... 
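
The two tasks at set_facts.yml:12 and set_facts.yml:17 are both gated on "not __network_is_ostree is defined" and both skip here because the flag was set on an earlier pass through the role. The log only shows their actions (stat and set_fact) and the shared condition; the file being checked and the register name below are assumptions for illustration, not taken from this log.

    # Hedged sketch of set_facts.yml:12 and :17. Module choices and the when:
    # condition come from the log; /run/ostree-booted and __ostree_booted_stat
    # are assumed names used only to make the sketch self-contained.
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

Guarding both tasks with the same condition makes the probe idempotent: once __network_is_ostree exists as a fact, later invocations of the role skip the stat entirely, which is the behavior visible in this run.
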
26764 1726882727.57182: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 26764 1726882727.57301: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000246 26764 1726882727.57313: variable 'ansible_search_path' from source: unknown 26764 1726882727.57317: variable 'ansible_search_path' from source: unknown 26764 1726882727.57354: calling self._execute() 26764 1726882727.57440: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882727.57447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882727.57467: variable 'omit' from source: magic vars 26764 1726882727.57840: variable 'ansible_distribution_major_version' from source: facts 26764 1726882727.57852: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882727.58039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882727.58315: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882727.58370: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882727.58427: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882727.58472: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882727.58567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882727.58594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882727.58621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882727.58646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882727.58735: variable '__network_is_ostree' from source: set_fact 26764 1726882727.58742: Evaluated conditional (not __network_is_ostree is defined): False 26764 1726882727.58745: when evaluation is False, skipping this task 26764 1726882727.58747: _execute() done 26764 1726882727.58750: dumping result to json 26764 1726882727.58755: done dumping result, returning 26764 1726882727.58770: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-9875-c9a3-000000000246] 26764 1726882727.58773: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000246 26764 1726882727.58858: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000246 26764 1726882727.58860: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 26764 1726882727.58914: no more pending results, returning what we have 26764 1726882727.58917: results queue empty 26764 1726882727.58919: checking for any_errors_fatal 26764 1726882727.58925: done checking for any_errors_fatal 26764 
1726882727.58925: checking for max_fail_percentage 26764 1726882727.58927: done checking for max_fail_percentage 26764 1726882727.58928: checking to see if all hosts have failed and the running result is not ok 26764 1726882727.58929: done checking to see if all hosts have failed 26764 1726882727.58930: getting the remaining hosts for this loop 26764 1726882727.58931: done getting the remaining hosts for this loop 26764 1726882727.58935: getting the next task for host managed_node2 26764 1726882727.58946: done getting next task for host managed_node2 26764 1726882727.58949: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 26764 1726882727.58953: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882727.58971: getting variables 26764 1726882727.58973: in VariableManager get_vars() 26764 1726882727.59013: Calling all_inventory to load vars for managed_node2 26764 1726882727.59016: Calling groups_inventory to load vars for managed_node2 26764 1726882727.59019: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882727.59030: Calling all_plugins_play to load vars for managed_node2 26764 1726882727.59034: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882727.59037: Calling groups_plugins_play to load vars for managed_node2 26764 1726882727.60736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882727.62514: done with get_vars() 26764 1726882727.62537: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:38:47 -0400 (0:00:00.060) 0:00:13.568 ****** 26764 1726882727.62642: entering _queue_task() for managed_node2/service_facts 26764 1726882727.62934: worker is 1 (out of 1 available) 26764 1726882727.62950: exiting _queue_task() for managed_node2/service_facts 26764 1726882727.62962: done queuing things up, now waiting for results queue to drain 26764 1726882727.62965: waiting for pending results... 
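
The task queued next, from set_facts.yml:21, runs the service_facts module; its JSON result (the large ansible_facts.services mapping) appears at the end of this excerpt. A minimal sketch of such a task, plus a purely illustrative consumer showing the shape of the returned data, follows; the debug task is not part of this log, and the crond.service key is used only because it appears in the module output below.

    # First task mirrors what the log shows being queued for managed_node2.
    # The debug task is a hypothetical consumer: each entry in
    # ansible_facts.services has name/state/status/source fields.
    - name: Check which services are running
      service_facts:

    - name: Show one entry from the gathered service facts   # illustration only
      debug:
        msg: "crond is {{ ansible_facts.services['crond.service'].state }}"

Roles typically use this mapping to decide which backend service to manage; here the module output reports, for example, crond.service and chronyd.service as running/enabled under systemd.
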
26764 1726882727.63248: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 26764 1726882727.63375: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000248 26764 1726882727.63393: variable 'ansible_search_path' from source: unknown 26764 1726882727.63397: variable 'ansible_search_path' from source: unknown 26764 1726882727.63434: calling self._execute() 26764 1726882727.63531: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882727.63537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882727.63547: variable 'omit' from source: magic vars 26764 1726882727.63936: variable 'ansible_distribution_major_version' from source: facts 26764 1726882727.63952: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882727.63959: variable 'omit' from source: magic vars 26764 1726882727.64018: variable 'omit' from source: magic vars 26764 1726882727.64054: variable 'omit' from source: magic vars 26764 1726882727.64104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882727.64144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882727.64169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882727.64192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882727.64203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882727.64233: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882727.64236: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882727.64239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882727.64346: Set connection var ansible_shell_executable to /bin/sh 26764 1726882727.64351: Set connection var ansible_shell_type to sh 26764 1726882727.64369: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882727.64372: Set connection var ansible_timeout to 10 26764 1726882727.64377: Set connection var ansible_connection to ssh 26764 1726882727.64383: Set connection var ansible_pipelining to False 26764 1726882727.64411: variable 'ansible_shell_executable' from source: unknown 26764 1726882727.64414: variable 'ansible_connection' from source: unknown 26764 1726882727.64417: variable 'ansible_module_compression' from source: unknown 26764 1726882727.64419: variable 'ansible_shell_type' from source: unknown 26764 1726882727.64421: variable 'ansible_shell_executable' from source: unknown 26764 1726882727.64423: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882727.64425: variable 'ansible_pipelining' from source: unknown 26764 1726882727.64428: variable 'ansible_timeout' from source: unknown 26764 1726882727.64433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882727.64649: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882727.64658: variable 'omit' from source: magic vars 26764 
1726882727.64667: starting attempt loop 26764 1726882727.64670: running the handler 26764 1726882727.64688: _low_level_execute_command(): starting 26764 1726882727.64696: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882727.65474: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882727.65491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.65502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882727.65516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.65556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.65569: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882727.65579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.65593: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882727.65602: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882727.65610: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882727.65619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.65632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882727.65643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.65652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.65657: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882727.65669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.65750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882727.65776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882727.65793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882727.65908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882727.67577: stdout chunk (state=3): >>>/root <<< 26764 1726882727.67736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882727.67768: stderr chunk (state=3): >>><<< 26764 1726882727.67771: stdout chunk (state=3): >>><<< 26764 1726882727.67886: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882727.67890: _low_level_execute_command(): starting 26764 1726882727.67893: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976 `" && echo ansible-tmp-1726882727.6779156-27356-257274429163976="` echo /root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976 `" ) && sleep 0' 26764 1726882727.68498: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882727.68513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.68531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882727.68552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.68599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.68612: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882727.68626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.68647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882727.68660: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882727.68679: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882727.68693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.68708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882727.68724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.68737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.68749: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882727.68772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.68846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882727.68869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882727.68886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882727.69022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882727.70897: stdout chunk (state=3): >>>ansible-tmp-1726882727.6779156-27356-257274429163976=/root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976 <<< 26764 1726882727.71015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882727.71082: stderr chunk (state=3): >>><<< 26764 1726882727.71085: stdout chunk (state=3): >>><<< 26764 1726882727.71172: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882727.6779156-27356-257274429163976=/root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882727.71176: variable 'ansible_module_compression' from source: unknown 26764 1726882727.71419: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 26764 1726882727.71422: variable 'ansible_facts' from source: unknown 26764 1726882727.71424: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976/AnsiballZ_service_facts.py 26764 1726882727.71488: Sending initial data 26764 1726882727.71491: Sent initial data (162 bytes) 26764 1726882727.72475: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882727.72489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.72507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882727.72524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.72570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.72583: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882727.72597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.72617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882727.72629: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882727.72641: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882727.72653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.72670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882727.72689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.72702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.72712: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882727.72728: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.72808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882727.72824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882727.72841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882727.72975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882727.74767: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882727.74859: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882727.74961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpxo2e4wdy /root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976/AnsiballZ_service_facts.py <<< 26764 1726882727.75055: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882727.76498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882727.76506: stderr chunk (state=3): >>><<< 26764 1726882727.76509: stdout chunk (state=3): >>><<< 26764 1726882727.76527: done transferring module to remote 26764 1726882727.76539: _low_level_execute_command(): starting 26764 1726882727.76545: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976/ /root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976/AnsiballZ_service_facts.py && sleep 0' 26764 1726882727.77199: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882727.77208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.77218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882727.77237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.77277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.77283: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882727.77292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.77305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882727.77312: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882727.77320: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882727.77328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.77341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
26764 1726882727.77353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.77360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.77369: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882727.77380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.77468: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882727.77473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882727.77482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882727.77654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882727.79423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882727.79498: stderr chunk (state=3): >>><<< 26764 1726882727.79501: stdout chunk (state=3): >>><<< 26764 1726882727.79518: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882727.79521: _low_level_execute_command(): starting 26764 1726882727.79524: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976/AnsiballZ_service_facts.py && sleep 0' 26764 1726882727.80120: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882727.80129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.80139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882727.80153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.80195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.80201: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882727.80211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.80226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882727.80229: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address <<< 26764 1726882727.80237: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882727.80244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882727.80253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882727.80274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882727.80279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882727.80286: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882727.80296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882727.80375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882727.80383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882727.80398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882727.80535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882729.16088: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 26764 1726882729.16121: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": 
{"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 26764 1726882729.16135: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"n<<< 26764 1726882729.16143: stdout chunk (state=3): >>>ame": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", <<< 26764 1726882729.16146: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "sys<<< 26764 1726882729.16150: stdout chunk (state=3): >>>temd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 26764 1726882729.17468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 26764 1726882729.17523: stderr chunk (state=3): >>><<< 26764 1726882729.17532: stdout chunk (state=3): >>><<< 26764 1726882729.17651: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": 
{"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": 
"systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": 
"firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
26764 1726882729.18611: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882729.18630: _low_level_execute_command(): starting 26764 1726882729.18640: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882727.6779156-27356-257274429163976/ > /dev/null 2>&1 && sleep 0' 26764 1726882729.20603: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882729.20618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.20645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.20668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.20718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.20728: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882729.20739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.20753: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882729.20763: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882729.20779: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882729.20798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.20809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.20821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.20830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.20839: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882729.20849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.20937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882729.20957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882729.20976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882729.21109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882729.23027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882729.23031: stdout chunk (state=3): >>><<< 26764 1726882729.23033: stderr chunk (state=3): >>><<< 26764 1726882729.23274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882729.23277: handler run complete 26764 1726882729.23280: variable 'ansible_facts' from source: unknown 26764 1726882729.23627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882729.24823: variable 'ansible_facts' from source: unknown 26764 1726882729.25030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882729.25416: attempt loop complete, returning result 26764 1726882729.25570: _execute() done 26764 1726882729.25577: dumping result to json 26764 1726882729.25636: done dumping result, returning 26764 1726882729.25648: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-9875-c9a3-000000000248] 26764 1726882729.25657: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000248 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 1726882729.27139: no more pending results, returning what we have 26764 1726882729.27142: results queue empty 26764 1726882729.27143: checking for any_errors_fatal 26764 1726882729.27150: done checking for any_errors_fatal 26764 1726882729.27150: checking for max_fail_percentage 26764 1726882729.27152: done checking for max_fail_percentage 26764 1726882729.27153: checking to see if all hosts have failed and the running result is not ok 26764 1726882729.27154: done checking to see if all hosts have failed 26764 1726882729.27155: getting the remaining hosts for this loop 26764 1726882729.27156: done getting the remaining hosts for this loop 26764 1726882729.27160: getting the next task for host managed_node2 26764 1726882729.27170: done getting next task for host managed_node2 26764 1726882729.27174: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 26764 1726882729.27178: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882729.27188: getting variables 26764 1726882729.27190: in VariableManager get_vars() 26764 1726882729.27227: Calling all_inventory to load vars for managed_node2 26764 1726882729.27230: Calling groups_inventory to load vars for managed_node2 26764 1726882729.27233: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882729.27245: Calling all_plugins_play to load vars for managed_node2 26764 1726882729.27247: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882729.27250: Calling groups_plugins_play to load vars for managed_node2 26764 1726882729.28273: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000248 26764 1726882729.28280: WORKER PROCESS EXITING 26764 1726882729.30542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882729.34014: done with get_vars() 26764 1726882729.34050: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:38:49 -0400 (0:00:01.715) 0:00:15.283 ****** 26764 1726882729.34158: entering _queue_task() for managed_node2/package_facts 26764 1726882729.34491: worker is 1 (out of 1 available) 26764 1726882729.34503: exiting _queue_task() for managed_node2/package_facts 26764 1726882729.34515: done queuing things up, now waiting for results queue to drain 26764 1726882729.34516: waiting for pending results... 
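The task being queued here, "Check which packages are installed" from roles/network/tasks/set_facts.yml:26, runs the package_facts module, and the lines that follow show its conditional ansible_distribution_major_version != '6' evaluating to True before the module is shipped over the existing SSH connection. A minimal, hypothetical reconstruction of such a task (not the role's verbatim source; the explicit manager setting is an assumption):

- name: Check which packages are installed
  package_facts:
    manager: auto      # assumption: default package-manager autodetection
  when: ansible_distribution_major_version != '6'

The resulting ansible_facts.packages dictionary, keyed by package name with per-package name/version/release/epoch/arch/source entries, is exactly the shape of the large JSON payload printed further down in this log.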
26764 1726882729.34809: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 26764 1726882729.34957: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000249 26764 1726882729.34984: variable 'ansible_search_path' from source: unknown 26764 1726882729.34992: variable 'ansible_search_path' from source: unknown 26764 1726882729.35035: calling self._execute() 26764 1726882729.35142: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882729.35155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882729.35175: variable 'omit' from source: magic vars 26764 1726882729.35572: variable 'ansible_distribution_major_version' from source: facts 26764 1726882729.35591: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882729.35603: variable 'omit' from source: magic vars 26764 1726882729.35678: variable 'omit' from source: magic vars 26764 1726882729.35711: variable 'omit' from source: magic vars 26764 1726882729.35759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882729.35807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882729.35836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882729.35859: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882729.35887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882729.35921: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882729.35932: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882729.35946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882729.36053: Set connection var ansible_shell_executable to /bin/sh 26764 1726882729.36062: Set connection var ansible_shell_type to sh 26764 1726882729.36083: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882729.36094: Set connection var ansible_timeout to 10 26764 1726882729.36108: Set connection var ansible_connection to ssh 26764 1726882729.36118: Set connection var ansible_pipelining to False 26764 1726882729.36145: variable 'ansible_shell_executable' from source: unknown 26764 1726882729.36156: variable 'ansible_connection' from source: unknown 26764 1726882729.36171: variable 'ansible_module_compression' from source: unknown 26764 1726882729.36181: variable 'ansible_shell_type' from source: unknown 26764 1726882729.36189: variable 'ansible_shell_executable' from source: unknown 26764 1726882729.36197: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882729.36210: variable 'ansible_pipelining' from source: unknown 26764 1726882729.36217: variable 'ansible_timeout' from source: unknown 26764 1726882729.36224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882729.36440: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882729.36456: variable 'omit' from source: magic vars 26764 
1726882729.36469: starting attempt loop 26764 1726882729.36477: running the handler 26764 1726882729.36498: _low_level_execute_command(): starting 26764 1726882729.36510: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882729.37277: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882729.37294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.37310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.37327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.37371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.37387: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882729.37402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.37424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882729.37435: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882729.37446: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882729.37456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.37475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.37493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.37507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.37520: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882729.37535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.37619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882729.37646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882729.37661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882729.37834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882729.39463: stdout chunk (state=3): >>>/root <<< 26764 1726882729.39643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882729.39646: stdout chunk (state=3): >>><<< 26764 1726882729.39648: stderr chunk (state=3): >>><<< 26764 1726882729.39762: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882729.39769: _low_level_execute_command(): starting 26764 1726882729.39772: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409 `" && echo ansible-tmp-1726882729.396707-27424-63184270636409="` echo /root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409 `" ) && sleep 0' 26764 1726882729.40682: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882729.40862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.40930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882729.40936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882729.41027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882729.42930: stdout chunk (state=3): >>>ansible-tmp-1726882729.396707-27424-63184270636409=/root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409 <<< 26764 1726882729.43120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882729.43123: stderr chunk (state=3): >>><<< 26764 1726882729.43126: stdout chunk (state=3): >>><<< 26764 1726882729.43128: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882729.396707-27424-63184270636409=/root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882729.43174: variable 'ansible_module_compression' from source: unknown 26764 1726882729.43224: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 26764 1726882729.43288: variable 'ansible_facts' from source: unknown 26764 1726882729.43462: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409/AnsiballZ_package_facts.py 26764 1726882729.43624: Sending initial data 26764 1726882729.43627: Sent initial data (160 bytes) 26764 1726882729.44543: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882729.44551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.44561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.44585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.44790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.44794: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882729.44796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.44799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882729.44801: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882729.44803: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882729.44804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.44806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.44809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.44811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.44813: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882729.44815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.44817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882729.44819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882729.44821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882729.45589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882729.47488: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882729.47494: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882729.47627: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmp12s2ci1m /root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409/AnsiballZ_package_facts.py <<< 26764 1726882729.47698: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882729.50585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882729.50773: stderr chunk (state=3): >>><<< 26764 1726882729.50776: stdout chunk (state=3): >>><<< 26764 1726882729.50778: done transferring module to remote 26764 1726882729.50872: _low_level_execute_command(): starting 26764 1726882729.50876: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409/ /root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409/AnsiballZ_package_facts.py && sleep 0' 26764 1726882729.52245: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882729.52260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.52281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.52301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.52349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.52472: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882729.52489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.52508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882729.52520: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882729.52531: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882729.52543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.52558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.52583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.52597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.52608: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882729.52623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.52706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882729.52796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882729.52812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882729.52995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882729.54891: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 26764 1726882729.54895: stdout chunk (state=3): >>><<< 26764 1726882729.54897: stderr chunk (state=3): >>><<< 26764 1726882729.55003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882729.55010: _low_level_execute_command(): starting 26764 1726882729.55013: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409/AnsiballZ_package_facts.py && sleep 0' 26764 1726882729.55654: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882729.55672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.55695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.55713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.55753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.55769: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882729.55788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.55810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882729.55821: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882729.55831: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882729.55842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882729.55854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882729.55874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882729.55887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882729.55903: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882729.55921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882729.55991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 
1726882729.56021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882729.56037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882729.56177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882730.02561: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": 
[{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 26764 1726882730.02626: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": 
"1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 26764 1726882730.02671: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": 
"3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": 
"1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": 
"lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 26764 1726882730.02712: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": 
"rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": 
[{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": 
"460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 26764 1726882730.02740: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": 
"1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 26764 1726882730.02785: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", 
"epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 26764 1726882730.04279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882730.04358: stderr chunk (state=3): >>><<< 26764 1726882730.04361: stdout chunk (state=3): >>><<< 26764 1726882730.04483: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": 
"20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": 
"libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": 
"iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": 
"8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": 
"less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": 
"70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": 
[{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": 
"2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": 
"perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": 
"perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": 
"3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": 
"mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
26764 1726882730.07078: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882730.07105: _low_level_execute_command(): starting 26764 1726882730.07114: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882729.396707-27424-63184270636409/ > /dev/null 2>&1 && sleep 0' 26764 1726882730.07774: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882730.07791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882730.07814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882730.07833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882730.07881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882730.07893: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882730.07914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882730.07933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882730.07945: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882730.07956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882730.07973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882730.07987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882730.08002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882730.08014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882730.08032: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882730.08046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882730.08147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882730.08180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882730.08196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882730.08331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882730.10179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882730.10208: stderr chunk (state=3): >>><<< 26764 1726882730.10211: stdout chunk (state=3): >>><<< 26764 1726882730.10227: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882730.10232: handler run complete 26764 1726882730.11146: variable 'ansible_facts' from source: unknown 26764 1726882730.11640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.13836: variable 'ansible_facts' from source: unknown 26764 1726882730.14326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.15143: attempt loop complete, returning result 26764 1726882730.15155: _execute() done 26764 1726882730.15158: dumping result to json 26764 1726882730.15367: done dumping result, returning 26764 1726882730.15377: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-9875-c9a3-000000000249] 26764 1726882730.15382: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000249 26764 1726882730.17624: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000249 26764 1726882730.17628: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 1726882730.17820: no more pending results, returning what we have 26764 1726882730.17823: results queue empty 26764 1726882730.17825: checking for any_errors_fatal 26764 1726882730.17830: done checking for any_errors_fatal 26764 1726882730.17831: checking for max_fail_percentage 26764 1726882730.17833: done checking for max_fail_percentage 26764 1726882730.17834: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.17834: done checking to see if all hosts have failed 26764 1726882730.17835: getting the remaining hosts for this loop 26764 1726882730.17837: done getting the remaining hosts for this loop 26764 1726882730.17840: getting the next task for host managed_node2 26764 1726882730.17848: done getting next task for host managed_node2 26764 1726882730.17851: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 26764 1726882730.17854: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882730.17867: getting variables 26764 1726882730.17869: in VariableManager get_vars() 26764 1726882730.17907: Calling all_inventory to load vars for managed_node2 26764 1726882730.17910: Calling groups_inventory to load vars for managed_node2 26764 1726882730.17913: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.17923: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.17926: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.17929: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.19443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.21593: done with get_vars() 26764 1726882730.21620: done getting variables 26764 1726882730.21681: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:38:50 -0400 (0:00:00.875) 0:00:16.158 ****** 26764 1726882730.21713: entering _queue_task() for managed_node2/debug 26764 1726882730.21997: worker is 1 (out of 1 available) 26764 1726882730.22008: exiting _queue_task() for managed_node2/debug 26764 1726882730.22019: done queuing things up, now waiting for results queue to drain 26764 1726882730.22020: waiting for pending results... 
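The debug task just queued (main.yml:7) prints the provider chosen earlier via set_fact; the trace that follows resolves network_provider and reports "Using network provider: nm". One plausible shape for such a task, with the message wording inferred from that output rather than copied from the role:

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"   # wording inferred from the MSG in the result below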
26764 1726882730.22293: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 26764 1726882730.22397: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e0 26764 1726882730.22409: variable 'ansible_search_path' from source: unknown 26764 1726882730.22413: variable 'ansible_search_path' from source: unknown 26764 1726882730.22444: calling self._execute() 26764 1726882730.22535: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.22539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.22549: variable 'omit' from source: magic vars 26764 1726882730.22913: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.22924: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.22931: variable 'omit' from source: magic vars 26764 1726882730.22980: variable 'omit' from source: magic vars 26764 1726882730.23075: variable 'network_provider' from source: set_fact 26764 1726882730.23091: variable 'omit' from source: magic vars 26764 1726882730.23133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882730.23165: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882730.23186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882730.23202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882730.23214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882730.23247: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882730.23250: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.23252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.23357: Set connection var ansible_shell_executable to /bin/sh 26764 1726882730.23360: Set connection var ansible_shell_type to sh 26764 1726882730.23375: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882730.23381: Set connection var ansible_timeout to 10 26764 1726882730.23386: Set connection var ansible_connection to ssh 26764 1726882730.23392: Set connection var ansible_pipelining to False 26764 1726882730.23413: variable 'ansible_shell_executable' from source: unknown 26764 1726882730.23416: variable 'ansible_connection' from source: unknown 26764 1726882730.23418: variable 'ansible_module_compression' from source: unknown 26764 1726882730.23421: variable 'ansible_shell_type' from source: unknown 26764 1726882730.23423: variable 'ansible_shell_executable' from source: unknown 26764 1726882730.23425: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.23427: variable 'ansible_pipelining' from source: unknown 26764 1726882730.23431: variable 'ansible_timeout' from source: unknown 26764 1726882730.23435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.23572: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 26764 1726882730.23582: variable 'omit' from source: magic vars 26764 1726882730.23588: starting attempt loop 26764 1726882730.23591: running the handler 26764 1726882730.23632: handler run complete 26764 1726882730.23645: attempt loop complete, returning result 26764 1726882730.23648: _execute() done 26764 1726882730.23651: dumping result to json 26764 1726882730.23653: done dumping result, returning 26764 1726882730.23667: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-9875-c9a3-0000000001e0] 26764 1726882730.23676: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e0 26764 1726882730.23757: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e0 26764 1726882730.23759: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 26764 1726882730.23821: no more pending results, returning what we have 26764 1726882730.23824: results queue empty 26764 1726882730.23825: checking for any_errors_fatal 26764 1726882730.23835: done checking for any_errors_fatal 26764 1726882730.23836: checking for max_fail_percentage 26764 1726882730.23838: done checking for max_fail_percentage 26764 1726882730.23839: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.23839: done checking to see if all hosts have failed 26764 1726882730.23840: getting the remaining hosts for this loop 26764 1726882730.23842: done getting the remaining hosts for this loop 26764 1726882730.23845: getting the next task for host managed_node2 26764 1726882730.23852: done getting next task for host managed_node2 26764 1726882730.23855: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26764 1726882730.23859: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.23874: getting variables 26764 1726882730.23876: in VariableManager get_vars() 26764 1726882730.23914: Calling all_inventory to load vars for managed_node2 26764 1726882730.23917: Calling groups_inventory to load vars for managed_node2 26764 1726882730.23919: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.23930: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.23933: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.23935: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.25471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.27346: done with get_vars() 26764 1726882730.27369: done getting variables 26764 1726882730.27420: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:38:50 -0400 (0:00:00.057) 0:00:16.216 ****** 26764 1726882730.27448: entering _queue_task() for managed_node2/fail 26764 1726882730.27688: worker is 1 (out of 1 available) 26764 1726882730.27699: exiting _queue_task() for managed_node2/fail 26764 1726882730.27711: done queuing things up, now waiting for results queue to drain 26764 1726882730.27712: waiting for pending results... 
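The fail task queued above (main.yml:11) guards against combining network_state with the initscripts provider; the trace that follows evaluates network_state != {} as False and skips it. A minimal sketch assuming only the condition the trace reports; the failure message is hypothetical and the real task very likely carries an additional provider check that is omitted here.

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying `network_state` is not supported with the initscripts provider   # hypothetical wording
  when: network_state != {}   # only the condition reported in this trace; the role may add more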
26764 1726882730.27973: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 26764 1726882730.28077: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e1 26764 1726882730.28089: variable 'ansible_search_path' from source: unknown 26764 1726882730.28094: variable 'ansible_search_path' from source: unknown 26764 1726882730.28124: calling self._execute() 26764 1726882730.28210: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.28214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.28226: variable 'omit' from source: magic vars 26764 1726882730.28576: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.28587: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.28714: variable 'network_state' from source: role '' defaults 26764 1726882730.28722: Evaluated conditional (network_state != {}): False 26764 1726882730.28725: when evaluation is False, skipping this task 26764 1726882730.28728: _execute() done 26764 1726882730.28731: dumping result to json 26764 1726882730.28735: done dumping result, returning 26764 1726882730.28741: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-9875-c9a3-0000000001e1] 26764 1726882730.28748: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e1 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882730.28883: no more pending results, returning what we have 26764 1726882730.28886: results queue empty 26764 1726882730.28888: checking for any_errors_fatal 26764 1726882730.28893: done checking for any_errors_fatal 26764 1726882730.28894: checking for max_fail_percentage 26764 1726882730.28896: done checking for max_fail_percentage 26764 1726882730.28897: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.28897: done checking to see if all hosts have failed 26764 1726882730.28898: getting the remaining hosts for this loop 26764 1726882730.28899: done getting the remaining hosts for this loop 26764 1726882730.28903: getting the next task for host managed_node2 26764 1726882730.28910: done getting next task for host managed_node2 26764 1726882730.28913: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26764 1726882730.28917: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.28933: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e1 26764 1726882730.28944: getting variables 26764 1726882730.28946: in VariableManager get_vars() 26764 1726882730.28988: Calling all_inventory to load vars for managed_node2 26764 1726882730.28991: Calling groups_inventory to load vars for managed_node2 26764 1726882730.28994: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.29007: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.29010: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.29014: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.29533: WORKER PROCESS EXITING 26764 1726882730.30533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.32338: done with get_vars() 26764 1726882730.32362: done getting variables 26764 1726882730.32424: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:38:50 -0400 (0:00:00.050) 0:00:16.266 ****** 26764 1726882730.32455: entering _queue_task() for managed_node2/fail 26764 1726882730.32712: worker is 1 (out of 1 available) 26764 1726882730.32722: exiting _queue_task() for managed_node2/fail 26764 1726882730.32732: done queuing things up, now waiting for results queue to drain 26764 1726882730.32733: waiting for pending results... 
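The next fail task (main.yml:18) is skipped for the same reason: Ansible treats a when list as an AND and stops at the first item that evaluates False, which would explain why only network_state != {} appears in the following trace even though the task name implies a version check as well. A sketch under that assumption; the version condition is inferred from the task name, not from the trace, and the message is hypothetical.

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires EL 8 or later           # hypothetical wording
  when:
    - network_state != {}                              # reported as False below, so evaluation stops here
    - ansible_distribution_major_version | int < 8     # inferred from the task name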
26764 1726882730.33007: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 26764 1726882730.33103: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e2 26764 1726882730.33114: variable 'ansible_search_path' from source: unknown 26764 1726882730.33118: variable 'ansible_search_path' from source: unknown 26764 1726882730.33153: calling self._execute() 26764 1726882730.33246: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.33252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.33265: variable 'omit' from source: magic vars 26764 1726882730.33625: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.33637: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.33752: variable 'network_state' from source: role '' defaults 26764 1726882730.33761: Evaluated conditional (network_state != {}): False 26764 1726882730.33766: when evaluation is False, skipping this task 26764 1726882730.33769: _execute() done 26764 1726882730.33774: dumping result to json 26764 1726882730.33777: done dumping result, returning 26764 1726882730.33786: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-9875-c9a3-0000000001e2] 26764 1726882730.33791: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e2 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882730.33927: no more pending results, returning what we have 26764 1726882730.33930: results queue empty 26764 1726882730.33931: checking for any_errors_fatal 26764 1726882730.33937: done checking for any_errors_fatal 26764 1726882730.33938: checking for max_fail_percentage 26764 1726882730.33940: done checking for max_fail_percentage 26764 1726882730.33941: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.33942: done checking to see if all hosts have failed 26764 1726882730.33942: getting the remaining hosts for this loop 26764 1726882730.33943: done getting the remaining hosts for this loop 26764 1726882730.33947: getting the next task for host managed_node2 26764 1726882730.33954: done getting next task for host managed_node2 26764 1726882730.33958: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26764 1726882730.33961: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.33983: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e2 26764 1726882730.33988: WORKER PROCESS EXITING 26764 1726882730.34000: getting variables 26764 1726882730.34002: in VariableManager get_vars() 26764 1726882730.34039: Calling all_inventory to load vars for managed_node2 26764 1726882730.34042: Calling groups_inventory to load vars for managed_node2 26764 1726882730.34044: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.34056: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.34059: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.34062: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.36622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.39100: done with get_vars() 26764 1726882730.39127: done getting variables 26764 1726882730.39193: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:38:50 -0400 (0:00:00.067) 0:00:16.333 ****** 26764 1726882730.39227: entering _queue_task() for managed_node2/fail 26764 1726882730.39760: worker is 1 (out of 1 available) 26764 1726882730.40305: exiting _queue_task() for managed_node2/fail 26764 1726882730.40315: done queuing things up, now waiting for results queue to drain 26764 1726882730.40316: waiting for pending results... 
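The teaming abort (main.yml:25) is skipped on this host because the version guard fails. Only the condition the trace reports is sketched below; the role presumably also checks whether any team connections are actually requested, which is omitted here, and the message is hypothetical.

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later                        # hypothetical wording
  when: ansible_distribution_major_version | int > 9   # evaluates to False for the EL9 host in this run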
26764 1726882730.40603: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 26764 1726882730.40730: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e3 26764 1726882730.40749: variable 'ansible_search_path' from source: unknown 26764 1726882730.40762: variable 'ansible_search_path' from source: unknown 26764 1726882730.40806: calling self._execute() 26764 1726882730.40900: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.40912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.40925: variable 'omit' from source: magic vars 26764 1726882730.41288: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.41311: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.41482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882730.43834: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882730.43910: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882730.43951: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882730.43995: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882730.44032: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882730.44114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.44169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.44201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.44255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.44279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.44384: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.44405: Evaluated conditional (ansible_distribution_major_version | int > 9): False 26764 1726882730.44414: when evaluation is False, skipping this task 26764 1726882730.44421: _execute() done 26764 1726882730.44430: dumping result to json 26764 1726882730.44437: done dumping result, returning 26764 1726882730.44453: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-9875-c9a3-0000000001e3] 26764 1726882730.44462: sending task result for task 
0e448fcc-3ce9-9875-c9a3-0000000001e3 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 26764 1726882730.44617: no more pending results, returning what we have 26764 1726882730.44620: results queue empty 26764 1726882730.44621: checking for any_errors_fatal 26764 1726882730.44628: done checking for any_errors_fatal 26764 1726882730.44629: checking for max_fail_percentage 26764 1726882730.44630: done checking for max_fail_percentage 26764 1726882730.44637: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.44638: done checking to see if all hosts have failed 26764 1726882730.44639: getting the remaining hosts for this loop 26764 1726882730.44640: done getting the remaining hosts for this loop 26764 1726882730.44643: getting the next task for host managed_node2 26764 1726882730.44650: done getting next task for host managed_node2 26764 1726882730.44654: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26764 1726882730.44656: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.44668: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e3 26764 1726882730.44674: WORKER PROCESS EXITING 26764 1726882730.44686: getting variables 26764 1726882730.44688: in VariableManager get_vars() 26764 1726882730.44725: Calling all_inventory to load vars for managed_node2 26764 1726882730.44728: Calling groups_inventory to load vars for managed_node2 26764 1726882730.44730: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.44740: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.44742: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.44745: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.45958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.46907: done with get_vars() 26764 1726882730.46922: done getting variables 26764 1726882730.46962: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:38:50 -0400 (0:00:00.077) 0:00:16.411 ****** 26764 1726882730.46986: entering _queue_task() for managed_node2/dnf 26764 1726882730.47196: worker is 1 (out of 1 available) 26764 1726882730.47208: exiting _queue_task() for managed_node2/dnf 26764 1726882730.47220: done queuing things up, now waiting for results queue to drain 26764 1726882730.47221: waiting for pending results... 
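The dnf task queued above (main.yml:36) only applies when wireless or team connections are requested; the trace that follows resolves network_connections and the interface variable, evaluates __network_wireless_connections_defined or __network_team_connections_defined to False, and skips it. A hedged sketch: the package list variable and the check_mode flag are assumptions based on the task name, while the when conditions are the ones evaluated in the trace.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"     # assumed; network_packages is resolved later in this run
    state: latest
  check_mode: true                     # assumed from "check if updates ... are available"
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined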
26764 1726882730.47435: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 26764 1726882730.47528: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e4 26764 1726882730.47538: variable 'ansible_search_path' from source: unknown 26764 1726882730.47541: variable 'ansible_search_path' from source: unknown 26764 1726882730.47574: calling self._execute() 26764 1726882730.47640: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.47645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.47653: variable 'omit' from source: magic vars 26764 1726882730.47922: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.47933: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.48065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882730.50001: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882730.50041: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882730.50073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882730.50100: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882730.50119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882730.50179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.50198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.50215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.50240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.50251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.50330: variable 'ansible_distribution' from source: facts 26764 1726882730.50334: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.50345: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 26764 1726882730.50421: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882730.50505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.50522: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.50538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.50563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.50577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.50606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.50621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.50637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.50661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.50683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.50711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.50727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.50743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.50770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.50782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.50882: variable 'network_connections' from source: include params 26764 1726882730.50890: variable 'interface' from source: play vars 26764 1726882730.50939: variable 'interface' from source: play vars 26764 1726882730.50998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882730.51104: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882730.51130: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882730.51153: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882730.51177: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882730.51207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882730.51223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882730.51248: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.51263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882730.51307: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882730.51448: variable 'network_connections' from source: include params 26764 1726882730.51452: variable 'interface' from source: play vars 26764 1726882730.51499: variable 'interface' from source: play vars 26764 1726882730.51523: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26764 1726882730.51526: when evaluation is False, skipping this task 26764 1726882730.51529: _execute() done 26764 1726882730.51531: dumping result to json 26764 1726882730.51534: done dumping result, returning 26764 1726882730.51540: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-9875-c9a3-0000000001e4] 26764 1726882730.51545: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e4 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26764 1726882730.51670: no more pending results, returning what we have 26764 1726882730.51673: results queue empty 26764 1726882730.51675: checking for any_errors_fatal 26764 1726882730.51683: done checking for any_errors_fatal 26764 1726882730.51684: checking for max_fail_percentage 26764 1726882730.51685: done checking for max_fail_percentage 26764 1726882730.51686: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.51687: done checking to see if all hosts have failed 26764 1726882730.51687: getting the remaining hosts for this loop 26764 1726882730.51689: done getting the remaining hosts for this loop 26764 1726882730.51692: getting the next task for host managed_node2 26764 1726882730.51699: done getting next task for host managed_node2 26764 1726882730.51703: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26764 1726882730.51705: ^ state is: 
HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882730.51720: getting variables 26764 1726882730.51722: in VariableManager get_vars() 26764 1726882730.51757: Calling all_inventory to load vars for managed_node2 26764 1726882730.51759: Calling groups_inventory to load vars for managed_node2 26764 1726882730.51761: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.51771: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.51774: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.51776: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.52348: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e4 26764 1726882730.52351: WORKER PROCESS EXITING 26764 1726882730.52646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.53577: done with get_vars() 26764 1726882730.53592: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 26764 1726882730.53642: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:38:50 -0400 (0:00:00.066) 0:00:16.478 ****** 26764 1726882730.53676: entering _queue_task() for managed_node2/yum 26764 1726882730.53871: worker is 1 (out of 1 available) 26764 1726882730.53884: exiting _queue_task() for managed_node2/yum 26764 1726882730.53894: done queuing things up, now waiting for results queue to drain 26764 1726882730.53895: waiting for pending results... 
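The YUM variant (main.yml:48) covers hosts older than EL8; note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above, since on ansible-core 2.17 the yum action is handled by the dnf action plugin. On this EL9 host the version guard below evaluates to False, so only the DNF path earlier in the file could ever apply. A sketch with the same assumptions as the DNF variant:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:                 # redirected to ansible.builtin.dnf, as the log notes
    name: "{{ network_packages }}"     # assumed
    state: latest
  check_mode: true                     # assumed
  when: ansible_distribution_major_version | int < 8    # False on EL9, hence the skip that follows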
26764 1726882730.54054: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 26764 1726882730.54130: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e5 26764 1726882730.54140: variable 'ansible_search_path' from source: unknown 26764 1726882730.54143: variable 'ansible_search_path' from source: unknown 26764 1726882730.54174: calling self._execute() 26764 1726882730.54239: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.54243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.54253: variable 'omit' from source: magic vars 26764 1726882730.54506: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.54516: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.54631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882730.56784: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882730.56835: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882730.56869: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882730.56896: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882730.56917: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882730.56972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.57000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.57022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.57049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.57060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.57126: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.57136: Evaluated conditional (ansible_distribution_major_version | int < 8): False 26764 1726882730.57139: when evaluation is False, skipping this task 26764 1726882730.57142: _execute() done 26764 1726882730.57144: dumping result to json 26764 1726882730.57149: done dumping result, returning 26764 1726882730.57155: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-9875-c9a3-0000000001e5] 26764 
1726882730.57160: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e5 26764 1726882730.57246: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e5 26764 1726882730.57248: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 26764 1726882730.57295: no more pending results, returning what we have 26764 1726882730.57298: results queue empty 26764 1726882730.57299: checking for any_errors_fatal 26764 1726882730.57305: done checking for any_errors_fatal 26764 1726882730.57305: checking for max_fail_percentage 26764 1726882730.57307: done checking for max_fail_percentage 26764 1726882730.57308: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.57308: done checking to see if all hosts have failed 26764 1726882730.57309: getting the remaining hosts for this loop 26764 1726882730.57310: done getting the remaining hosts for this loop 26764 1726882730.57313: getting the next task for host managed_node2 26764 1726882730.57319: done getting next task for host managed_node2 26764 1726882730.57323: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26764 1726882730.57325: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.57339: getting variables 26764 1726882730.57340: in VariableManager get_vars() 26764 1726882730.57375: Calling all_inventory to load vars for managed_node2 26764 1726882730.57377: Calling groups_inventory to load vars for managed_node2 26764 1726882730.57379: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.57388: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.57390: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.57392: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.58145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.59147: done with get_vars() 26764 1726882730.59161: done getting variables 26764 1726882730.59203: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:38:50 -0400 (0:00:00.055) 0:00:16.533 ****** 26764 1726882730.59224: entering _queue_task() for managed_node2/fail 26764 1726882730.59417: worker is 1 (out of 1 available) 26764 1726882730.59430: exiting _queue_task() for managed_node2/fail 26764 1726882730.59442: done queuing things up, now waiting for results queue to drain 26764 1726882730.59443: waiting for pending results... 
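The consent task (main.yml:60) uses the fail action as a guard: if wireless or team connections are requested without the user opting in to a NetworkManager restart, the play stops. In this run the trace evaluates __network_wireless_connections_defined or __network_team_connections_defined to False, so it is skipped. A sketch in which only that condition comes from the trace; the message wording is hypothetical, and the real task presumably also checks a user-facing consent variable that is omitted here.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: Wireless or team interfaces require restarting NetworkManager; set the role's consent variable to proceed   # hypothetical wording
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    # a consent-variable check presumably follows in the real task; omitted here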
26764 1726882730.59606: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 26764 1726882730.59684: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e6 26764 1726882730.59694: variable 'ansible_search_path' from source: unknown 26764 1726882730.59698: variable 'ansible_search_path' from source: unknown 26764 1726882730.59725: calling self._execute() 26764 1726882730.59793: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.59797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.59806: variable 'omit' from source: magic vars 26764 1726882730.60059: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.60072: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.60151: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882730.60285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882730.61790: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882730.61831: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882730.61858: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882730.61889: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882730.61908: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882730.61963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.61996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.62014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.62043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.62054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.62089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.62105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.62121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.62146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.62156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.62188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.62205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.62221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.62245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.62255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.62366: variable 'network_connections' from source: include params 26764 1726882730.62377: variable 'interface' from source: play vars 26764 1726882730.62426: variable 'interface' from source: play vars 26764 1726882730.62475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882730.62580: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882730.62607: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882730.62630: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882730.62651: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882730.62688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882730.62703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882730.62721: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.62739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882730.62785: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882730.62933: variable 'network_connections' 
from source: include params 26764 1726882730.62936: variable 'interface' from source: play vars 26764 1726882730.62983: variable 'interface' from source: play vars 26764 1726882730.63006: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26764 1726882730.63010: when evaluation is False, skipping this task 26764 1726882730.63012: _execute() done 26764 1726882730.63015: dumping result to json 26764 1726882730.63018: done dumping result, returning 26764 1726882730.63024: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-9875-c9a3-0000000001e6] 26764 1726882730.63029: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e6 26764 1726882730.63116: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e6 26764 1726882730.63119: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26764 1726882730.63197: no more pending results, returning what we have 26764 1726882730.63200: results queue empty 26764 1726882730.63201: checking for any_errors_fatal 26764 1726882730.63205: done checking for any_errors_fatal 26764 1726882730.63206: checking for max_fail_percentage 26764 1726882730.63207: done checking for max_fail_percentage 26764 1726882730.63208: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.63209: done checking to see if all hosts have failed 26764 1726882730.63210: getting the remaining hosts for this loop 26764 1726882730.63211: done getting the remaining hosts for this loop 26764 1726882730.63214: getting the next task for host managed_node2 26764 1726882730.63219: done getting next task for host managed_node2 26764 1726882730.63222: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 26764 1726882730.63224: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.63237: getting variables 26764 1726882730.63238: in VariableManager get_vars() 26764 1726882730.63273: Calling all_inventory to load vars for managed_node2 26764 1726882730.63279: Calling groups_inventory to load vars for managed_node2 26764 1726882730.63282: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.63290: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.63292: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.63295: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.64062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.64978: done with get_vars() 26764 1726882730.64993: done getting variables 26764 1726882730.65034: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:38:50 -0400 (0:00:00.058) 0:00:16.592 ****** 26764 1726882730.65056: entering _queue_task() for managed_node2/package 26764 1726882730.65255: worker is 1 (out of 1 available) 26764 1726882730.65271: exiting _queue_task() for managed_node2/package 26764 1726882730.65284: done queuing things up, now waiting for results queue to drain 26764 1726882730.65285: waiting for pending results... 26764 1726882730.65442: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 26764 1726882730.65538: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e7 26764 1726882730.65548: variable 'ansible_search_path' from source: unknown 26764 1726882730.65551: variable 'ansible_search_path' from source: unknown 26764 1726882730.65586: calling self._execute() 26764 1726882730.65653: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.65656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.65669: variable 'omit' from source: magic vars 26764 1726882730.65926: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.65936: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.66067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882730.66251: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882730.66288: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882730.66313: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882730.66365: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882730.66439: variable 'network_packages' from source: role '' defaults 26764 1726882730.66511: variable '__network_provider_setup' from source: role '' defaults 26764 1726882730.66520: variable '__network_service_name_default_nm' from source: role '' defaults 26764 1726882730.66572: variable 
'__network_service_name_default_nm' from source: role '' defaults 26764 1726882730.66579: variable '__network_packages_default_nm' from source: role '' defaults 26764 1726882730.66623: variable '__network_packages_default_nm' from source: role '' defaults 26764 1726882730.66736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882730.68309: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882730.68347: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882730.68376: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882730.68400: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882730.68419: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882730.68478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.68499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.68516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.68541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.68552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.68585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.68603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.68620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.68644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.68654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.68792: variable '__network_packages_default_gobject_packages' from source: role '' defaults 26764 1726882730.68860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.68881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.68897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.68923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.68934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.68995: variable 'ansible_python' from source: facts 26764 1726882730.69011: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 26764 1726882730.69067: variable '__network_wpa_supplicant_required' from source: role '' defaults 26764 1726882730.69122: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26764 1726882730.69207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.69223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.69244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.69270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.69281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.69312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.69331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.69349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.69380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.69390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.69485: variable 'network_connections' from source: include params 26764 1726882730.69488: variable 'interface' from source: play vars 26764 1726882730.69557: variable 'interface' from source: play vars 26764 1726882730.69615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882730.69633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882730.69653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.69680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882730.69714: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882730.69907: variable 'network_connections' from source: include params 26764 1726882730.69911: variable 'interface' from source: play vars 26764 1726882730.69983: variable 'interface' from source: play vars 26764 1726882730.70021: variable '__network_packages_default_wireless' from source: role '' defaults 26764 1726882730.70086: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882730.70293: variable 'network_connections' from source: include params 26764 1726882730.70296: variable 'interface' from source: play vars 26764 1726882730.70342: variable 'interface' from source: play vars 26764 1726882730.70360: variable '__network_packages_default_team' from source: role '' defaults 26764 1726882730.70416: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882730.70730: variable 'network_connections' from source: include params 26764 1726882730.70742: variable 'interface' from source: play vars 26764 1726882730.70814: variable 'interface' from source: play vars 26764 1726882730.70877: variable '__network_service_name_default_initscripts' from source: role '' defaults 26764 1726882730.70942: variable '__network_service_name_default_initscripts' from source: role '' defaults 26764 1726882730.70954: variable '__network_packages_default_initscripts' from source: role '' defaults 26764 1726882730.71024: variable '__network_packages_default_initscripts' from source: role '' defaults 26764 1726882730.71257: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 26764 1726882730.71747: variable 'network_connections' from source: include params 26764 1726882730.71750: variable 'interface' from source: play vars 26764 1726882730.71800: variable 'interface' from source: play vars 26764 1726882730.71807: variable 'ansible_distribution' from source: facts 26764 1726882730.71810: variable '__network_rh_distros' from source: role '' defaults 26764 1726882730.71815: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.71836: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 26764 1726882730.71958: variable 'ansible_distribution' from source: 
facts 26764 1726882730.71962: variable '__network_rh_distros' from source: role '' defaults 26764 1726882730.71970: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.71980: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 26764 1726882730.72082: variable 'ansible_distribution' from source: facts 26764 1726882730.72087: variable '__network_rh_distros' from source: role '' defaults 26764 1726882730.72092: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.72115: variable 'network_provider' from source: set_fact 26764 1726882730.72126: variable 'ansible_facts' from source: unknown 26764 1726882730.72515: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 26764 1726882730.72520: when evaluation is False, skipping this task 26764 1726882730.72523: _execute() done 26764 1726882730.72525: dumping result to json 26764 1726882730.72527: done dumping result, returning 26764 1726882730.72535: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-9875-c9a3-0000000001e7] 26764 1726882730.72539: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e7 26764 1726882730.72623: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e7 26764 1726882730.72625: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 26764 1726882730.72679: no more pending results, returning what we have 26764 1726882730.72683: results queue empty 26764 1726882730.72684: checking for any_errors_fatal 26764 1726882730.72692: done checking for any_errors_fatal 26764 1726882730.72693: checking for max_fail_percentage 26764 1726882730.72695: done checking for max_fail_percentage 26764 1726882730.72696: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.72696: done checking to see if all hosts have failed 26764 1726882730.72697: getting the remaining hosts for this loop 26764 1726882730.72698: done getting the remaining hosts for this loop 26764 1726882730.72702: getting the next task for host managed_node2 26764 1726882730.72710: done getting next task for host managed_node2 26764 1726882730.72713: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 26764 1726882730.72716: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.72730: getting variables 26764 1726882730.72732: in VariableManager get_vars() 26764 1726882730.72771: Calling all_inventory to load vars for managed_node2 26764 1726882730.72774: Calling groups_inventory to load vars for managed_node2 26764 1726882730.72777: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.72786: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.72788: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.72790: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.73819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.75013: done with get_vars() 26764 1726882730.75034: done getting variables 26764 1726882730.75088: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:38:50 -0400 (0:00:00.100) 0:00:16.692 ****** 26764 1726882730.75115: entering _queue_task() for managed_node2/package 26764 1726882730.75348: worker is 1 (out of 1 available) 26764 1726882730.75362: exiting _queue_task() for managed_node2/package 26764 1726882730.75376: done queuing things up, now waiting for results queue to drain 26764 1726882730.75377: waiting for pending results... 
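The "Install packages" task above (tasks/main.yml:73) is skipped because its guard, not network_packages is subset(ansible_facts.packages.keys()), evaluates to False: every package listed in network_packages is already present in the gathered package facts, so nothing needs installing. A minimal sketch of a package task guarded this way is shown below; it assumes the role feeds the network_packages list straight into the package module, which this log implies (the 'package' action module is loaded) but does not spell out.

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # assumed: the role-default package list traced in the variables above
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
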
26764 1726882730.75543: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 26764 1726882730.75625: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e8 26764 1726882730.75636: variable 'ansible_search_path' from source: unknown 26764 1726882730.75640: variable 'ansible_search_path' from source: unknown 26764 1726882730.75671: calling self._execute() 26764 1726882730.75740: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.75743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.75752: variable 'omit' from source: magic vars 26764 1726882730.76012: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.76021: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.76105: variable 'network_state' from source: role '' defaults 26764 1726882730.76113: Evaluated conditional (network_state != {}): False 26764 1726882730.76119: when evaluation is False, skipping this task 26764 1726882730.76122: _execute() done 26764 1726882730.76125: dumping result to json 26764 1726882730.76129: done dumping result, returning 26764 1726882730.76136: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-9875-c9a3-0000000001e8] 26764 1726882730.76143: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e8 26764 1726882730.76231: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e8 26764 1726882730.76234: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882730.76292: no more pending results, returning what we have 26764 1726882730.76296: results queue empty 26764 1726882730.76297: checking for any_errors_fatal 26764 1726882730.76301: done checking for any_errors_fatal 26764 1726882730.76302: checking for max_fail_percentage 26764 1726882730.76303: done checking for max_fail_percentage 26764 1726882730.76304: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.76305: done checking to see if all hosts have failed 26764 1726882730.76306: getting the remaining hosts for this loop 26764 1726882730.76307: done getting the remaining hosts for this loop 26764 1726882730.76310: getting the next task for host managed_node2 26764 1726882730.76316: done getting next task for host managed_node2 26764 1726882730.76319: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 26764 1726882730.76322: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.76335: getting variables 26764 1726882730.76337: in VariableManager get_vars() 26764 1726882730.76370: Calling all_inventory to load vars for managed_node2 26764 1726882730.76373: Calling groups_inventory to load vars for managed_node2 26764 1726882730.76374: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.76383: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.76385: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.76388: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.77146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.78070: done with get_vars() 26764 1726882730.78084: done getting variables 26764 1726882730.78124: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:38:50 -0400 (0:00:00.030) 0:00:16.723 ****** 26764 1726882730.78143: entering _queue_task() for managed_node2/package 26764 1726882730.78323: worker is 1 (out of 1 available) 26764 1726882730.78337: exiting _queue_task() for managed_node2/package 26764 1726882730.78348: done queuing things up, now waiting for results queue to drain 26764 1726882730.78349: waiting for pending results... 
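The "Install NetworkManager and nmstate when using network_state variable" task (tasks/main.yml:85) is skipped because network_state is left at its empty role default, so the network_state != {} guard is False; this run drives configuration through network_connections instead. A hedged example of how a caller could exercise the network_state path is sketched below; the host, interface name, and nmstate settings are illustrative and not taken from this run.

    - hosts: managed_node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_state:
              interfaces:
                - name: eth1          # illustrative interface, not the one used in this run
                  type: ethernet
                  state: up
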
26764 1726882730.78499: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 26764 1726882730.78579: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001e9 26764 1726882730.78590: variable 'ansible_search_path' from source: unknown 26764 1726882730.78593: variable 'ansible_search_path' from source: unknown 26764 1726882730.78622: calling self._execute() 26764 1726882730.78687: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.78693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.78703: variable 'omit' from source: magic vars 26764 1726882730.78942: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.78951: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.79033: variable 'network_state' from source: role '' defaults 26764 1726882730.79039: Evaluated conditional (network_state != {}): False 26764 1726882730.79047: when evaluation is False, skipping this task 26764 1726882730.79050: _execute() done 26764 1726882730.79053: dumping result to json 26764 1726882730.79055: done dumping result, returning 26764 1726882730.79062: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-9875-c9a3-0000000001e9] 26764 1726882730.79068: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e9 26764 1726882730.79155: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001e9 26764 1726882730.79158: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882730.79203: no more pending results, returning what we have 26764 1726882730.79206: results queue empty 26764 1726882730.79207: checking for any_errors_fatal 26764 1726882730.79211: done checking for any_errors_fatal 26764 1726882730.79212: checking for max_fail_percentage 26764 1726882730.79214: done checking for max_fail_percentage 26764 1726882730.79214: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.79215: done checking to see if all hosts have failed 26764 1726882730.79216: getting the remaining hosts for this loop 26764 1726882730.79217: done getting the remaining hosts for this loop 26764 1726882730.79220: getting the next task for host managed_node2 26764 1726882730.79225: done getting next task for host managed_node2 26764 1726882730.79228: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 26764 1726882730.79231: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.79245: getting variables 26764 1726882730.79246: in VariableManager get_vars() 26764 1726882730.79278: Calling all_inventory to load vars for managed_node2 26764 1726882730.79280: Calling groups_inventory to load vars for managed_node2 26764 1726882730.79281: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.79288: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.79290: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.79291: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.80118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.81029: done with get_vars() 26764 1726882730.81042: done getting variables 26764 1726882730.81084: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:38:50 -0400 (0:00:00.029) 0:00:16.752 ****** 26764 1726882730.81110: entering _queue_task() for managed_node2/service 26764 1726882730.81318: worker is 1 (out of 1 available) 26764 1726882730.81328: exiting _queue_task() for managed_node2/service 26764 1726882730.81340: done queuing things up, now waiting for results queue to drain 26764 1726882730.81341: waiting for pending results... 
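The "Install python3-libnmstate when using network_state variable" task (tasks/main.yml:96) is skipped for the same reason: network_state != {} is False. A guarded package task of this shape could look like the sketch below; the exact package spec is assumed from the task title rather than shown in the log.

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate   # package name assumed from the task title
        state: present
      when: network_state != {}
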
26764 1726882730.81597: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 26764 1726882730.81702: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001ea 26764 1726882730.81718: variable 'ansible_search_path' from source: unknown 26764 1726882730.81724: variable 'ansible_search_path' from source: unknown 26764 1726882730.81757: calling self._execute() 26764 1726882730.81844: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.81853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.81870: variable 'omit' from source: magic vars 26764 1726882730.82194: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.82211: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.82322: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882730.82507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882730.84162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882730.84210: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882730.84236: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882730.84260: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882730.84287: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882730.84342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.84373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.84393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.84421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.84432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.84462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.84482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.84500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 26764 1726882730.84527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.84537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.84564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.84582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.84599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.84625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.84635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.84746: variable 'network_connections' from source: include params 26764 1726882730.84755: variable 'interface' from source: play vars 26764 1726882730.84806: variable 'interface' from source: play vars 26764 1726882730.84854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882730.84961: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882730.84991: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882730.85012: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882730.85032: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882730.85066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882730.85084: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882730.85101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.85118: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882730.85162: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882730.85313: variable 'network_connections' from source: include params 26764 1726882730.85316: variable 'interface' 
from source: play vars 26764 1726882730.85358: variable 'interface' from source: play vars 26764 1726882730.85388: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 26764 1726882730.85391: when evaluation is False, skipping this task 26764 1726882730.85394: _execute() done 26764 1726882730.85396: dumping result to json 26764 1726882730.85399: done dumping result, returning 26764 1726882730.85405: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-9875-c9a3-0000000001ea] 26764 1726882730.85410: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ea skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 26764 1726882730.85533: no more pending results, returning what we have 26764 1726882730.85536: results queue empty 26764 1726882730.85538: checking for any_errors_fatal 26764 1726882730.85542: done checking for any_errors_fatal 26764 1726882730.85543: checking for max_fail_percentage 26764 1726882730.85545: done checking for max_fail_percentage 26764 1726882730.85546: checking to see if all hosts have failed and the running result is not ok 26764 1726882730.85546: done checking to see if all hosts have failed 26764 1726882730.85547: getting the remaining hosts for this loop 26764 1726882730.85548: done getting the remaining hosts for this loop 26764 1726882730.85551: getting the next task for host managed_node2 26764 1726882730.85557: done getting next task for host managed_node2 26764 1726882730.85561: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26764 1726882730.85565: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882730.85579: getting variables 26764 1726882730.85580: in VariableManager get_vars() 26764 1726882730.85614: Calling all_inventory to load vars for managed_node2 26764 1726882730.85617: Calling groups_inventory to load vars for managed_node2 26764 1726882730.85619: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882730.85627: Calling all_plugins_play to load vars for managed_node2 26764 1726882730.85629: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882730.85632: Calling groups_plugins_play to load vars for managed_node2 26764 1726882730.86391: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ea 26764 1726882730.86394: WORKER PROCESS EXITING 26764 1726882730.86404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882730.90355: done with get_vars() 26764 1726882730.90374: done getting variables 26764 1726882730.90406: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:38:50 -0400 (0:00:00.093) 0:00:16.845 ****** 26764 1726882730.90423: entering _queue_task() for managed_node2/service 26764 1726882730.90637: worker is 1 (out of 1 available) 26764 1726882730.90648: exiting _queue_task() for managed_node2/service 26764 1726882730.90661: done queuing things up, now waiting for results queue to drain 26764 1726882730.90662: waiting for pending results... 
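The restart task above (tasks/main.yml:109) is skipped because none of the connections in network_connections are wireless or team interfaces, so __network_wireless_connections_defined or __network_team_connections_defined is False. The "Enable and start NetworkManager" task just queued (tasks/main.yml:122) is the first task in this block whose guard holds (network_provider == "nm" or network_state != {} evaluates True below), so it goes on to open an SSH connection to the managed node. A sketch of a service task guarded this way is shown below, assuming the role enables and starts the unit named by network_service_name; the state and enabled parameters are assumptions, not copied from the role.

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # assumed to resolve to NetworkManager.service under the nm provider
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
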
26764 1726882730.90832: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 26764 1726882730.90912: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001eb 26764 1726882730.90922: variable 'ansible_search_path' from source: unknown 26764 1726882730.90926: variable 'ansible_search_path' from source: unknown 26764 1726882730.90952: calling self._execute() 26764 1726882730.91024: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.91029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.91038: variable 'omit' from source: magic vars 26764 1726882730.91300: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.91312: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882730.91417: variable 'network_provider' from source: set_fact 26764 1726882730.91422: variable 'network_state' from source: role '' defaults 26764 1726882730.91429: Evaluated conditional (network_provider == "nm" or network_state != {}): True 26764 1726882730.91436: variable 'omit' from source: magic vars 26764 1726882730.91470: variable 'omit' from source: magic vars 26764 1726882730.91488: variable 'network_service_name' from source: role '' defaults 26764 1726882730.91538: variable 'network_service_name' from source: role '' defaults 26764 1726882730.91612: variable '__network_provider_setup' from source: role '' defaults 26764 1726882730.91616: variable '__network_service_name_default_nm' from source: role '' defaults 26764 1726882730.91665: variable '__network_service_name_default_nm' from source: role '' defaults 26764 1726882730.91673: variable '__network_packages_default_nm' from source: role '' defaults 26764 1726882730.91716: variable '__network_packages_default_nm' from source: role '' defaults 26764 1726882730.91861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882730.93365: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882730.93418: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882730.93444: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882730.93472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882730.93495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882730.93548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.93569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.93588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.93616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 26764 1726882730.93627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.93657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.93676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.93693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.93718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.93734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.93881: variable '__network_packages_default_gobject_packages' from source: role '' defaults 26764 1726882730.93953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.93972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.93988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.94012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.94023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.94086: variable 'ansible_python' from source: facts 26764 1726882730.94102: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 26764 1726882730.94156: variable '__network_wpa_supplicant_required' from source: role '' defaults 26764 1726882730.94212: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26764 1726882730.94293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.94309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.94325: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.94349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.94360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.94396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882730.94416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882730.94432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.94456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882730.94469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882730.94557: variable 'network_connections' from source: include params 26764 1726882730.94563: variable 'interface' from source: play vars 26764 1726882730.94618: variable 'interface' from source: play vars 26764 1726882730.94687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882730.94810: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882730.94844: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882730.94876: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882730.94905: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882730.94949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882730.94970: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882730.94996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882730.95022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882730.95054: variable '__network_wireless_connections_defined' from source: 
role '' defaults 26764 1726882730.95225: variable 'network_connections' from source: include params 26764 1726882730.95230: variable 'interface' from source: play vars 26764 1726882730.95285: variable 'interface' from source: play vars 26764 1726882730.95318: variable '__network_packages_default_wireless' from source: role '' defaults 26764 1726882730.95373: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882730.95550: variable 'network_connections' from source: include params 26764 1726882730.95553: variable 'interface' from source: play vars 26764 1726882730.95607: variable 'interface' from source: play vars 26764 1726882730.95624: variable '__network_packages_default_team' from source: role '' defaults 26764 1726882730.95683: variable '__network_team_connections_defined' from source: role '' defaults 26764 1726882730.95855: variable 'network_connections' from source: include params 26764 1726882730.95859: variable 'interface' from source: play vars 26764 1726882730.95911: variable 'interface' from source: play vars 26764 1726882730.95952: variable '__network_service_name_default_initscripts' from source: role '' defaults 26764 1726882730.95999: variable '__network_service_name_default_initscripts' from source: role '' defaults 26764 1726882730.96007: variable '__network_packages_default_initscripts' from source: role '' defaults 26764 1726882730.96048: variable '__network_packages_default_initscripts' from source: role '' defaults 26764 1726882730.96182: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 26764 1726882730.96500: variable 'network_connections' from source: include params 26764 1726882730.96503: variable 'interface' from source: play vars 26764 1726882730.96545: variable 'interface' from source: play vars 26764 1726882730.96552: variable 'ansible_distribution' from source: facts 26764 1726882730.96555: variable '__network_rh_distros' from source: role '' defaults 26764 1726882730.96562: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.96588: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 26764 1726882730.96712: variable 'ansible_distribution' from source: facts 26764 1726882730.96716: variable '__network_rh_distros' from source: role '' defaults 26764 1726882730.96718: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.96727: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 26764 1726882730.96838: variable 'ansible_distribution' from source: facts 26764 1726882730.96841: variable '__network_rh_distros' from source: role '' defaults 26764 1726882730.96844: variable 'ansible_distribution_major_version' from source: facts 26764 1726882730.96884: variable 'network_provider' from source: set_fact 26764 1726882730.96900: variable 'omit' from source: magic vars 26764 1726882730.96920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882730.96939: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882730.96953: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882730.96968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882730.96978: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882730.97001: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882730.97004: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.97008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.97074: Set connection var ansible_shell_executable to /bin/sh 26764 1726882730.97077: Set connection var ansible_shell_type to sh 26764 1726882730.97085: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882730.97090: Set connection var ansible_timeout to 10 26764 1726882730.97097: Set connection var ansible_connection to ssh 26764 1726882730.97100: Set connection var ansible_pipelining to False 26764 1726882730.97118: variable 'ansible_shell_executable' from source: unknown 26764 1726882730.97120: variable 'ansible_connection' from source: unknown 26764 1726882730.97123: variable 'ansible_module_compression' from source: unknown 26764 1726882730.97125: variable 'ansible_shell_type' from source: unknown 26764 1726882730.97127: variable 'ansible_shell_executable' from source: unknown 26764 1726882730.97130: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882730.97134: variable 'ansible_pipelining' from source: unknown 26764 1726882730.97136: variable 'ansible_timeout' from source: unknown 26764 1726882730.97140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882730.97222: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882730.97229: variable 'omit' from source: magic vars 26764 1726882730.97232: starting attempt loop 26764 1726882730.97235: running the handler 26764 1726882730.97289: variable 'ansible_facts' from source: unknown 26764 1726882730.97780: _low_level_execute_command(): starting 26764 1726882730.97786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882730.98580: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882730.98618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882730.98756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 
1726882731.00420: stdout chunk (state=3): >>>/root <<< 26764 1726882731.00516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882731.00586: stderr chunk (state=3): >>><<< 26764 1726882731.00592: stdout chunk (state=3): >>><<< 26764 1726882731.00612: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882731.00625: _low_level_execute_command(): starting 26764 1726882731.00639: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177 `" && echo ansible-tmp-1726882731.0061038-27500-235690876483177="` echo /root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177 `" ) && sleep 0' 26764 1726882731.01093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.01097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.01132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.01136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.01138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882731.01141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.01187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882731.01195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882731.01309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882731.03190: stdout chunk 
(state=3): >>>ansible-tmp-1726882731.0061038-27500-235690876483177=/root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177 <<< 26764 1726882731.03388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882731.03391: stdout chunk (state=3): >>><<< 26764 1726882731.03393: stderr chunk (state=3): >>><<< 26764 1726882731.03476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882731.0061038-27500-235690876483177=/root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882731.03480: variable 'ansible_module_compression' from source: unknown 26764 1726882731.03774: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 26764 1726882731.03777: variable 'ansible_facts' from source: unknown 26764 1726882731.03779: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177/AnsiballZ_systemd.py 26764 1726882731.03916: Sending initial data 26764 1726882731.03919: Sent initial data (156 bytes) 26764 1726882731.04922: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882731.04937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.04953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.04979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.05021: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.05033: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882731.05051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.05075: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882731.05094: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882731.05107: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882731.05119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.05133: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 26764 1726882731.05149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.05161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.05178: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882731.05192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.05276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882731.05299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882731.05322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882731.05451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882731.07199: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882731.07298: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882731.07401: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpylgkar7q /root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177/AnsiballZ_systemd.py <<< 26764 1726882731.07498: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882731.10295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882731.10551: stderr chunk (state=3): >>><<< 26764 1726882731.10555: stdout chunk (state=3): >>><<< 26764 1726882731.10557: done transferring module to remote 26764 1726882731.10559: _low_level_execute_command(): starting 26764 1726882731.10562: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177/ /root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177/AnsiballZ_systemd.py && sleep 0' 26764 1726882731.11149: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882731.11165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.11181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.11199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.11247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.11260: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882731.11277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.11297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882731.11309: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882731.11321: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882731.11340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.11355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.11373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.11386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.11399: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882731.11412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.11496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882731.11518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882731.11535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882731.11666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882731.13499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882731.13578: stderr chunk (state=3): >>><<< 26764 1726882731.13588: stdout chunk (state=3): >>><<< 26764 1726882731.13686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882731.13689: _low_level_execute_command(): starting 26764 1726882731.13692: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177/AnsiballZ_systemd.py && sleep 0' 26764 1726882731.14256: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882731.14274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.14290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.14307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.14354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 
1726882731.14370: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882731.14385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.14403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882731.14416: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882731.14432: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882731.14451: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.14468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.14485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.14498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.14510: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882731.14523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.14605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882731.14626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882731.14641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882731.14777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882731.39908: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload 
u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9146368", "MemoryAvailable": "infinity", "CPUUsageNSec": "1841565000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": <<< 26764 1726882731.39927: stdout chunk (state=3): >>>"0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", 
"SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 26764 1726882731.41482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 
closed. <<< 26764 1726882731.41497: stderr chunk (state=3): >>><<< 26764 1726882731.41500: stdout chunk (state=3): >>><<< 26764 1726882731.41521: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6692", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ExecMainStartTimestampMonotonic": "202392137", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6692", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "9146368", "MemoryAvailable": "infinity", "CPUUsageNSec": "1841565000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", 
"LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service network.service multi-user.target network.target shutdown.target cloud-init.service", "After": "cloud-init-local.service dbus-broker.service network-pre.target system.slice dbus.socket systemd-journald.socket basic.target sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", 
"LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:57 EDT", "StateChangeTimestampMonotonic": "316658837", "InactiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveExitTimestampMonotonic": "202392395", "ActiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveEnterTimestampMonotonic": "202472383", "ActiveExitTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ActiveExitTimestampMonotonic": "202362940", "InactiveEnterTimestamp": "Fri 2024-09-20 21:31:03 EDT", "InactiveEnterTimestampMonotonic": "202381901", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:31:03 EDT", "ConditionTimestampMonotonic": "202382734", "AssertTimestamp": "Fri 2024-09-20 21:31:03 EDT", "AssertTimestampMonotonic": "202382737", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "55e27919215348fab37a11b7ea324f90", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
26764 1726882731.41699: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882731.41717: _low_level_execute_command(): starting 26764 1726882731.41720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882731.0061038-27500-235690876483177/ > /dev/null 2>&1 && sleep 0' 26764 1726882731.43271: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.43274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.43316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882731.43319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882731.43337: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.43343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882731.43356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.43439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882731.43559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882731.43569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882731.43698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882731.45587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882731.45591: stderr chunk (state=3): >>><<< 26764 1726882731.45596: stdout chunk (state=3): >>><<< 26764 1726882731.45611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882731.45618: handler run complete 26764 1726882731.45683: attempt loop complete, returning result 26764 1726882731.45686: _execute() done 26764 1726882731.45688: dumping result to json 26764 1726882731.45707: done dumping result, returning 26764 1726882731.45717: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-9875-c9a3-0000000001eb] 26764 1726882731.45722: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001eb 26764 1726882731.46001: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001eb 26764 1726882731.46004: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 1726882731.46050: no more pending results, returning what we have 26764 1726882731.46054: results queue empty 26764 1726882731.46055: checking for any_errors_fatal 26764 1726882731.46059: done checking for any_errors_fatal 26764 1726882731.46060: checking for max_fail_percentage 26764 1726882731.46062: done checking for max_fail_percentage 26764 1726882731.46063: checking to see if all hosts have failed and the running result is not ok 26764 1726882731.46067: done checking to see if all hosts have failed 26764 1726882731.46068: getting the remaining hosts for this loop 26764 1726882731.46069: done getting the remaining hosts for this loop 26764 1726882731.46073: getting the next task for host managed_node2 26764 1726882731.46080: done getting next task for host managed_node2 26764 1726882731.46084: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26764 1726882731.46086: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882731.46096: getting variables 26764 1726882731.46098: in VariableManager get_vars() 26764 1726882731.46129: Calling all_inventory to load vars for managed_node2 26764 1726882731.46132: Calling groups_inventory to load vars for managed_node2 26764 1726882731.46134: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882731.46143: Calling all_plugins_play to load vars for managed_node2 26764 1726882731.46145: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882731.46147: Calling groups_plugins_play to load vars for managed_node2 26764 1726882731.47570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882731.49330: done with get_vars() 26764 1726882731.49353: done getting variables 26764 1726882731.49420: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:38:51 -0400 (0:00:00.590) 0:00:17.436 ****** 26764 1726882731.49453: entering _queue_task() for managed_node2/service 26764 1726882731.49783: worker is 1 (out of 1 available) 26764 1726882731.49795: exiting _queue_task() for managed_node2/service 26764 1726882731.49809: done queuing things up, now waiting for results queue to drain 26764 1726882731.49810: waiting for pending results... 26764 1726882731.50105: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 26764 1726882731.50224: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001ec 26764 1726882731.50240: variable 'ansible_search_path' from source: unknown 26764 1726882731.50248: variable 'ansible_search_path' from source: unknown 26764 1726882731.50299: calling self._execute() 26764 1726882731.50394: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882731.50407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882731.50419: variable 'omit' from source: magic vars 26764 1726882731.50811: variable 'ansible_distribution_major_version' from source: facts 26764 1726882731.50825: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882731.50940: variable 'network_provider' from source: set_fact 26764 1726882731.50956: Evaluated conditional (network_provider == "nm"): True 26764 1726882731.51056: variable '__network_wpa_supplicant_required' from source: role '' defaults 26764 1726882731.51162: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 26764 1726882731.51361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882731.54021: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882731.54096: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882731.54141: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882731.54185: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882731.54217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882731.54307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882731.54346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882731.54381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882731.54430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882731.54455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882731.54511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882731.54540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882731.54579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882731.54628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882731.54649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882731.54703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882731.54737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882731.54777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882731.54824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882731.54851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 26764 1726882731.55019: variable 'network_connections' from source: include params 26764 1726882731.55040: variable 'interface' from source: play vars 26764 1726882731.55139: variable 'interface' from source: play vars 26764 1726882731.55231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 26764 1726882731.55448: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 26764 1726882731.55508: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 26764 1726882731.55558: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 26764 1726882731.55596: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 26764 1726882731.55644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 26764 1726882731.55679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 26764 1726882731.55712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882731.55742: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 26764 1726882731.55800: variable '__network_wireless_connections_defined' from source: role '' defaults 26764 1726882731.56057: variable 'network_connections' from source: include params 26764 1726882731.56072: variable 'interface' from source: play vars 26764 1726882731.56143: variable 'interface' from source: play vars 26764 1726882731.56188: Evaluated conditional (__network_wpa_supplicant_required): False 26764 1726882731.56200: when evaluation is False, skipping this task 26764 1726882731.56207: _execute() done 26764 1726882731.56213: dumping result to json 26764 1726882731.56219: done dumping result, returning 26764 1726882731.56230: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-9875-c9a3-0000000001ec] 26764 1726882731.56251: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ec skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 26764 1726882731.56419: no more pending results, returning what we have 26764 1726882731.56424: results queue empty 26764 1726882731.56425: checking for any_errors_fatal 26764 1726882731.56441: done checking for any_errors_fatal 26764 1726882731.56442: checking for max_fail_percentage 26764 1726882731.56444: done checking for max_fail_percentage 26764 1726882731.56445: checking to see if all hosts have failed and the running result is not ok 26764 1726882731.56445: done checking to see if all hosts have failed 26764 1726882731.56446: getting the remaining hosts for this loop 26764 1726882731.56448: done getting the remaining hosts for this loop 26764 1726882731.56452: getting the next task for host managed_node2 26764 1726882731.56460: done 
getting next task for host managed_node2 26764 1726882731.56468: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 26764 1726882731.56471: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882731.56487: getting variables 26764 1726882731.56488: in VariableManager get_vars() 26764 1726882731.56532: Calling all_inventory to load vars for managed_node2 26764 1726882731.56535: Calling groups_inventory to load vars for managed_node2 26764 1726882731.56538: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882731.56550: Calling all_plugins_play to load vars for managed_node2 26764 1726882731.56553: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882731.56556: Calling groups_plugins_play to load vars for managed_node2 26764 1726882731.57506: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ec 26764 1726882731.57510: WORKER PROCESS EXITING 26764 1726882731.58540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882731.60324: done with get_vars() 26764 1726882731.60349: done getting variables 26764 1726882731.60409: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:38:51 -0400 (0:00:00.109) 0:00:17.546 ****** 26764 1726882731.60438: entering _queue_task() for managed_node2/service 26764 1726882731.60735: worker is 1 (out of 1 available) 26764 1726882731.60747: exiting _queue_task() for managed_node2/service 26764 1726882731.60758: done queuing things up, now waiting for results queue to drain 26764 1726882731.60758: waiting for pending results... 
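Note: the "Enable and start wpa_supplicant" task above was skipped rather than executed; the skipping result names the failing expression in its false_condition field. The log records two conditionals evaluated for it: network_provider == "nm" (True) and __network_wpa_supplicant_required (False). A minimal sketch of the shape such a guarded service task takes; the module parameters are illustrative assumptions, only the when expressions come from the log:

- name: Enable and start wpa_supplicant (illustrative shape of the guarded task)
  ansible.builtin.service:
    name: wpa_supplicant                           # assumed unit name for this sketch
    state: started
    enabled: true
  when:
    - network_provider == "nm"                     # evaluated True in the log
    - __network_wpa_supplicant_required | bool     # evaluated False, so the task is skipped

Because the second expression is false on this host, Ansible records skip_reason "Conditional result was False" and moves on to the next task in the role.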
26764 1726882731.61040: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 26764 1726882731.61163: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001ed 26764 1726882731.61185: variable 'ansible_search_path' from source: unknown 26764 1726882731.61193: variable 'ansible_search_path' from source: unknown 26764 1726882731.61236: calling self._execute() 26764 1726882731.61339: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882731.61350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882731.61363: variable 'omit' from source: magic vars 26764 1726882731.61757: variable 'ansible_distribution_major_version' from source: facts 26764 1726882731.61778: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882731.61903: variable 'network_provider' from source: set_fact 26764 1726882731.61913: Evaluated conditional (network_provider == "initscripts"): False 26764 1726882731.61920: when evaluation is False, skipping this task 26764 1726882731.61926: _execute() done 26764 1726882731.61932: dumping result to json 26764 1726882731.61939: done dumping result, returning 26764 1726882731.61948: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-9875-c9a3-0000000001ed] 26764 1726882731.61962: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ed skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 26764 1726882731.62109: no more pending results, returning what we have 26764 1726882731.62113: results queue empty 26764 1726882731.62114: checking for any_errors_fatal 26764 1726882731.62122: done checking for any_errors_fatal 26764 1726882731.62123: checking for max_fail_percentage 26764 1726882731.62125: done checking for max_fail_percentage 26764 1726882731.62126: checking to see if all hosts have failed and the running result is not ok 26764 1726882731.62126: done checking to see if all hosts have failed 26764 1726882731.62127: getting the remaining hosts for this loop 26764 1726882731.62129: done getting the remaining hosts for this loop 26764 1726882731.62132: getting the next task for host managed_node2 26764 1726882731.62140: done getting next task for host managed_node2 26764 1726882731.62143: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26764 1726882731.62146: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882731.62167: getting variables 26764 1726882731.62169: in VariableManager get_vars() 26764 1726882731.62209: Calling all_inventory to load vars for managed_node2 26764 1726882731.62212: Calling groups_inventory to load vars for managed_node2 26764 1726882731.62214: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882731.62227: Calling all_plugins_play to load vars for managed_node2 26764 1726882731.62231: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882731.62234: Calling groups_plugins_play to load vars for managed_node2 26764 1726882731.63225: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ed 26764 1726882731.63229: WORKER PROCESS EXITING 26764 1726882731.63930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882731.65731: done with get_vars() 26764 1726882731.65751: done getting variables 26764 1726882731.65813: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:38:51 -0400 (0:00:00.054) 0:00:17.600 ****** 26764 1726882731.65843: entering _queue_task() for managed_node2/copy 26764 1726882731.66090: worker is 1 (out of 1 available) 26764 1726882731.66101: exiting _queue_task() for managed_node2/copy 26764 1726882731.66113: done queuing things up, now waiting for results queue to drain 26764 1726882731.66114: waiting for pending results... 
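The 'Enable network service' result above is censored because the task sets no_log: true, and the skip itself comes from network_provider == "initscripts" evaluating to False. A hedged sketch of a task combining both (the service name and arguments are assumed for illustration, since the real ones are hidden by no_log):

    - name: Enable network service
      ansible.builtin.service:
        name: network          # assumed; the real task's arguments are hidden by no_log
        state: started
        enabled: true
      no_log: true
      when: network_provider == "initscripts"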
26764 1726882731.66387: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 26764 1726882731.66517: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001ee 26764 1726882731.66535: variable 'ansible_search_path' from source: unknown 26764 1726882731.66542: variable 'ansible_search_path' from source: unknown 26764 1726882731.66589: calling self._execute() 26764 1726882731.66686: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882731.66697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882731.66710: variable 'omit' from source: magic vars 26764 1726882731.67070: variable 'ansible_distribution_major_version' from source: facts 26764 1726882731.67089: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882731.67214: variable 'network_provider' from source: set_fact 26764 1726882731.67228: Evaluated conditional (network_provider == "initscripts"): False 26764 1726882731.67235: when evaluation is False, skipping this task 26764 1726882731.67241: _execute() done 26764 1726882731.67247: dumping result to json 26764 1726882731.67254: done dumping result, returning 26764 1726882731.67268: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-9875-c9a3-0000000001ee] 26764 1726882731.67280: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ee skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 26764 1726882731.67418: no more pending results, returning what we have 26764 1726882731.67422: results queue empty 26764 1726882731.67423: checking for any_errors_fatal 26764 1726882731.67429: done checking for any_errors_fatal 26764 1726882731.67429: checking for max_fail_percentage 26764 1726882731.67431: done checking for max_fail_percentage 26764 1726882731.67432: checking to see if all hosts have failed and the running result is not ok 26764 1726882731.67433: done checking to see if all hosts have failed 26764 1726882731.67434: getting the remaining hosts for this loop 26764 1726882731.67435: done getting the remaining hosts for this loop 26764 1726882731.67438: getting the next task for host managed_node2 26764 1726882731.67445: done getting next task for host managed_node2 26764 1726882731.67448: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26764 1726882731.67452: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882731.67471: getting variables 26764 1726882731.67473: in VariableManager get_vars() 26764 1726882731.67512: Calling all_inventory to load vars for managed_node2 26764 1726882731.67514: Calling groups_inventory to load vars for managed_node2 26764 1726882731.67517: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882731.67529: Calling all_plugins_play to load vars for managed_node2 26764 1726882731.67532: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882731.67535: Calling groups_plugins_play to load vars for managed_node2 26764 1726882731.68605: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ee 26764 1726882731.68608: WORKER PROCESS EXITING 26764 1726882731.69319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882731.71086: done with get_vars() 26764 1726882731.71110: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:38:51 -0400 (0:00:00.053) 0:00:17.653 ****** 26764 1726882731.71189: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 26764 1726882731.71429: worker is 1 (out of 1 available) 26764 1726882731.71445: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 26764 1726882731.71456: done queuing things up, now waiting for results queue to drain 26764 1726882731.71457: waiting for pending results... 26764 1726882731.71725: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 26764 1726882731.71854: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001ef 26764 1726882731.71880: variable 'ansible_search_path' from source: unknown 26764 1726882731.71887: variable 'ansible_search_path' from source: unknown 26764 1726882731.71927: calling self._execute() 26764 1726882731.72024: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882731.72035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882731.72048: variable 'omit' from source: magic vars 26764 1726882731.72423: variable 'ansible_distribution_major_version' from source: facts 26764 1726882731.72443: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882731.72453: variable 'omit' from source: magic vars 26764 1726882731.72504: variable 'omit' from source: magic vars 26764 1726882731.72673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 26764 1726882731.75055: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 26764 1726882731.75133: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 26764 1726882731.75179: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 26764 1726882731.75215: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 26764 1726882731.75250: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 26764 1726882731.75333: variable 'network_provider' from source: set_fact 26764 1726882731.75479: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 26764 1726882731.75510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 26764 1726882731.75539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 26764 1726882731.75595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 26764 1726882731.75614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 26764 1726882731.75698: variable 'omit' from source: magic vars 26764 1726882731.75816: variable 'omit' from source: magic vars 26764 1726882731.75928: variable 'network_connections' from source: include params 26764 1726882731.75942: variable 'interface' from source: play vars 26764 1726882731.76020: variable 'interface' from source: play vars 26764 1726882731.76184: variable 'omit' from source: magic vars 26764 1726882731.76196: variable '__lsr_ansible_managed' from source: task vars 26764 1726882731.76270: variable '__lsr_ansible_managed' from source: task vars 26764 1726882731.76470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 26764 1726882731.76701: Loaded config def from plugin (lookup/template) 26764 1726882731.76710: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 26764 1726882731.76740: File lookup term: get_ansible_managed.j2 26764 1726882731.76746: variable 'ansible_search_path' from source: unknown 26764 1726882731.76761: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 26764 1726882731.76784: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 26764 1726882731.76805: variable 'ansible_search_path' from source: unknown 26764 1726882731.83441: variable 'ansible_managed' from source: unknown 26764 1726882731.83588: variable 
'omit' from source: magic vars 26764 1726882731.83627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882731.83657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882731.83684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882731.83706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882731.83723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882731.83758: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882731.83772: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882731.83782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882731.83888: Set connection var ansible_shell_executable to /bin/sh 26764 1726882731.83896: Set connection var ansible_shell_type to sh 26764 1726882731.83912: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882731.83922: Set connection var ansible_timeout to 10 26764 1726882731.83932: Set connection var ansible_connection to ssh 26764 1726882731.83941: Set connection var ansible_pipelining to False 26764 1726882731.83979: variable 'ansible_shell_executable' from source: unknown 26764 1726882731.83989: variable 'ansible_connection' from source: unknown 26764 1726882731.83996: variable 'ansible_module_compression' from source: unknown 26764 1726882731.84002: variable 'ansible_shell_type' from source: unknown 26764 1726882731.84008: variable 'ansible_shell_executable' from source: unknown 26764 1726882731.84014: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882731.84022: variable 'ansible_pipelining' from source: unknown 26764 1726882731.84028: variable 'ansible_timeout' from source: unknown 26764 1726882731.84035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882731.84178: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882731.84201: variable 'omit' from source: magic vars 26764 1726882731.84213: starting attempt loop 26764 1726882731.84220: running the handler 26764 1726882731.84238: _low_level_execute_command(): starting 26764 1726882731.84248: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882731.85033: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882731.85054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.85075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.85095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.85136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.85149: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882731.85172: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.85190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882731.85201: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882731.85212: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882731.85225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.85239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.85255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.85285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.85288: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.85332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882731.85348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882731.85476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882731.87143: stdout chunk (state=3): >>>/root <<< 26764 1726882731.87292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882731.87317: stderr chunk (state=3): >>><<< 26764 1726882731.87323: stdout chunk (state=3): >>><<< 26764 1726882731.87353: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882731.87357: _low_level_execute_command(): starting 26764 1726882731.87368: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064 `" && echo ansible-tmp-1726882731.8734622-27530-139287272361064="` echo /root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064 `" ) && sleep 0' 26764 1726882731.87950: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882731.87960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.87975: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 26764 1726882731.87985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.88020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.88027: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882731.88036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.88048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882731.88055: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882731.88061: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882731.88070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.88081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.88094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.88101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.88107: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882731.88116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.88186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882731.88209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882731.88212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882731.88324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882731.90240: stdout chunk (state=3): >>>ansible-tmp-1726882731.8734622-27530-139287272361064=/root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064 <<< 26764 1726882731.90346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882731.90416: stderr chunk (state=3): >>><<< 26764 1726882731.90425: stdout chunk (state=3): >>><<< 26764 1726882731.90469: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882731.8734622-27530-139287272361064=/root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882731.90789: variable 'ansible_module_compression' from source: unknown 26764 1726882731.90793: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 26764 1726882731.90795: variable 'ansible_facts' from source: unknown 26764 1726882731.90797: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064/AnsiballZ_network_connections.py 26764 1726882731.90854: Sending initial data 26764 1726882731.90857: Sent initial data (168 bytes) 26764 1726882731.91794: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882731.91810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.91822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.91840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.91886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.91898: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882731.91912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.91929: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882731.91940: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882731.91951: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882731.91966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.91985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.92003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.92015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.92028: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882731.92042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.92124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882731.92145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882731.92161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882731.92294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882731.94074: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 
1726882731.94169: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882731.94270: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpguwhpkp6 /root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064/AnsiballZ_network_connections.py <<< 26764 1726882731.94369: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882731.96149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882731.96284: stderr chunk (state=3): >>><<< 26764 1726882731.96287: stdout chunk (state=3): >>><<< 26764 1726882731.96307: done transferring module to remote 26764 1726882731.96318: _low_level_execute_command(): starting 26764 1726882731.96321: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064/ /root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064/AnsiballZ_network_connections.py && sleep 0' 26764 1726882731.96916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882731.96925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.96935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.96948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.96989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.97001: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882731.97004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.97018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882731.97027: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882731.97030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882731.97038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.97047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.97059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.97069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.97079: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882731.97088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.97157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882731.97178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882731.97191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882731.97312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882731.99081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882731.99153: stderr chunk (state=3): >>><<< 26764 1726882731.99167: stdout chunk (state=3): >>><<< 26764 1726882731.99261: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882731.99270: _low_level_execute_command(): starting 26764 1726882731.99273: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064/AnsiballZ_network_connections.py && sleep 0' 26764 1726882731.99810: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882731.99830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882731.99843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882731.99858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882731.99905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882731.99917: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882731.99932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882731.99955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882731.99969: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882731.99980: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882731.99991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882732.00004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.00019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.00030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882732.00042: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882732.00060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.00136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882732.00162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882732.00180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
26764 1726882732.00309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882732.27038: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'rpltstbr': add connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153\n[004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 26764 1726882732.29375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882732.29428: stderr chunk (state=3): >>><<< 26764 1726882732.29432: stdout chunk (state=3): >>><<< 26764 1726882732.29448: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'rpltstbr': add connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153\n[004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": false}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
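The module_args echoed back above describe a single bridge profile named rpltstbr with DHCPv4 and IPv6 autoconf disabled. Play variables of roughly this shape (reconstructed from the logged arguments, not copied from the test playbook) would drive that network_connections invocation:

    network_provider: nm
    network_connections:
      - name: rpltstbr
        state: up
        type: bridge
        ip:
          dhcp4: false
          auto6: false

The role forwards these as the provider and connections module arguments and prepends the "# Ansible managed" __header rendered earlier via get_ansible_managed.j2.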
26764 1726882732.29482: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'rpltstbr', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': False}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882732.29489: _low_level_execute_command(): starting 26764 1726882732.29494: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882731.8734622-27530-139287272361064/ > /dev/null 2>&1 && sleep 0' 26764 1726882732.29933: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.29937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.29974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.29983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882732.29989: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882732.30002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.30011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.30063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882732.30076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882732.30079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882732.30203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882732.32028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882732.32075: stderr chunk (state=3): >>><<< 26764 1726882732.32078: stdout chunk (state=3): >>><<< 26764 1726882732.32090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882732.32096: handler run complete 26764 1726882732.32119: attempt loop complete, returning result 26764 1726882732.32122: _execute() done 26764 1726882732.32125: dumping result to json 26764 1726882732.32130: done dumping result, returning 26764 1726882732.32139: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-9875-c9a3-0000000001ef] 26764 1726882732.32143: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ef 26764 1726882732.32245: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001ef 26764 1726882732.32248: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "rpltstbr", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'rpltstbr': add connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 [004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (not-active) 26764 1726882732.32346: no more pending results, returning what we have 26764 1726882732.32349: results queue empty 26764 1726882732.32350: checking for any_errors_fatal 26764 1726882732.32355: done checking for any_errors_fatal 26764 1726882732.32356: checking for max_fail_percentage 26764 1726882732.32357: done checking for max_fail_percentage 26764 1726882732.32358: checking to see if all hosts have failed and the running result is not ok 26764 1726882732.32359: done checking to see if all hosts have failed 26764 1726882732.32360: getting the remaining hosts for this loop 26764 1726882732.32361: done getting the remaining hosts for this loop 26764 1726882732.32367: getting the next task for host managed_node2 26764 1726882732.32374: done getting next task for host managed_node2 26764 1726882732.32377: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 26764 1726882732.32380: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 26764 1726882732.32389: getting variables 26764 1726882732.32391: in VariableManager get_vars() 26764 1726882732.32428: Calling all_inventory to load vars for managed_node2 26764 1726882732.32430: Calling groups_inventory to load vars for managed_node2 26764 1726882732.32432: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882732.32442: Calling all_plugins_play to load vars for managed_node2 26764 1726882732.32444: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882732.32447: Calling groups_plugins_play to load vars for managed_node2 26764 1726882732.33455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882732.34915: done with get_vars() 26764 1726882732.34930: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:38:52 -0400 (0:00:00.638) 0:00:18.291 ****** 26764 1726882732.34993: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 26764 1726882732.35236: worker is 1 (out of 1 available) 26764 1726882732.35247: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 26764 1726882732.35260: done queuing things up, now waiting for results queue to drain 26764 1726882732.35261: waiting for pending results... 26764 1726882732.35549: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 26764 1726882732.35693: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001f0 26764 1726882732.35713: variable 'ansible_search_path' from source: unknown 26764 1726882732.35720: variable 'ansible_search_path' from source: unknown 26764 1726882732.35761: calling self._execute() 26764 1726882732.35857: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.35874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.35891: variable 'omit' from source: magic vars 26764 1726882732.36399: variable 'ansible_distribution_major_version' from source: facts 26764 1726882732.36408: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882732.36514: variable 'network_state' from source: role '' defaults 26764 1726882732.36521: Evaluated conditional (network_state != {}): False 26764 1726882732.36524: when evaluation is False, skipping this task 26764 1726882732.36527: _execute() done 26764 1726882732.36529: dumping result to json 26764 1726882732.36536: done dumping result, returning 26764 1726882732.36539: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-9875-c9a3-0000000001f0] 26764 1726882732.36544: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f0 26764 1726882732.36644: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f0 26764 1726882732.36647: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 26764 1726882732.36720: no more pending results, returning what we have 26764 1726882732.36723: results queue empty 26764 1726882732.36724: checking for any_errors_fatal 26764 1726882732.36735: done checking for 
any_errors_fatal 26764 1726882732.36735: checking for max_fail_percentage 26764 1726882732.36737: done checking for max_fail_percentage 26764 1726882732.36738: checking to see if all hosts have failed and the running result is not ok 26764 1726882732.36739: done checking to see if all hosts have failed 26764 1726882732.36739: getting the remaining hosts for this loop 26764 1726882732.36741: done getting the remaining hosts for this loop 26764 1726882732.36744: getting the next task for host managed_node2 26764 1726882732.36750: done getting next task for host managed_node2 26764 1726882732.36754: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 26764 1726882732.36756: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882732.36861: getting variables 26764 1726882732.36867: in VariableManager get_vars() 26764 1726882732.36902: Calling all_inventory to load vars for managed_node2 26764 1726882732.36905: Calling groups_inventory to load vars for managed_node2 26764 1726882732.36907: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882732.36916: Calling all_plugins_play to load vars for managed_node2 26764 1726882732.36918: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882732.36921: Calling groups_plugins_play to load vars for managed_node2 26764 1726882732.39076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882732.40934: done with get_vars() 26764 1726882732.40966: done getting variables 26764 1726882732.41027: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:38:52 -0400 (0:00:00.060) 0:00:18.352 ****** 26764 1726882732.41068: entering _queue_task() for managed_node2/debug 26764 1726882732.41377: worker is 1 (out of 1 available) 26764 1726882732.41391: exiting _queue_task() for managed_node2/debug 26764 1726882732.41403: done queuing things up, now waiting for results queue to drain 26764 1726882732.41404: waiting for pending results... 
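The 'Configure networking state' skip above follows from network_state coming from the role defaults and the task being guarded by network_state != {}. A sketch of that pairing, assuming the defaults define an empty dict and with the module argument name chosen for illustration:

    # defaults/main.yml (assumed)
    network_state: {}

    # tasks/main.yml (sketch)
    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        state: "{{ network_state }}"   # argument name assumed
      when: network_state != {}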
26764 1726882732.41687: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 26764 1726882732.41818: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001f1 26764 1726882732.41840: variable 'ansible_search_path' from source: unknown 26764 1726882732.41851: variable 'ansible_search_path' from source: unknown 26764 1726882732.41897: calling self._execute() 26764 1726882732.42000: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.42011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.42029: variable 'omit' from source: magic vars 26764 1726882732.42413: variable 'ansible_distribution_major_version' from source: facts 26764 1726882732.42429: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882732.42446: variable 'omit' from source: magic vars 26764 1726882732.42506: variable 'omit' from source: magic vars 26764 1726882732.42541: variable 'omit' from source: magic vars 26764 1726882732.42590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882732.42630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882732.42653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882732.42680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882732.42703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882732.42740: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882732.42751: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.42759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.42873: Set connection var ansible_shell_executable to /bin/sh 26764 1726882732.42881: Set connection var ansible_shell_type to sh 26764 1726882732.42895: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882732.42908: Set connection var ansible_timeout to 10 26764 1726882732.42917: Set connection var ansible_connection to ssh 26764 1726882732.42926: Set connection var ansible_pipelining to False 26764 1726882732.42953: variable 'ansible_shell_executable' from source: unknown 26764 1726882732.42960: variable 'ansible_connection' from source: unknown 26764 1726882732.42971: variable 'ansible_module_compression' from source: unknown 26764 1726882732.42978: variable 'ansible_shell_type' from source: unknown 26764 1726882732.42983: variable 'ansible_shell_executable' from source: unknown 26764 1726882732.42989: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.42995: variable 'ansible_pipelining' from source: unknown 26764 1726882732.43001: variable 'ansible_timeout' from source: unknown 26764 1726882732.43007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.43157: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 
1726882732.43178: variable 'omit' from source: magic vars 26764 1726882732.43189: starting attempt loop 26764 1726882732.43195: running the handler 26764 1726882732.43330: variable '__network_connections_result' from source: set_fact 26764 1726882732.43399: handler run complete 26764 1726882732.43420: attempt loop complete, returning result 26764 1726882732.43427: _execute() done 26764 1726882732.43433: dumping result to json 26764 1726882732.43439: done dumping result, returning 26764 1726882732.43455: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-9875-c9a3-0000000001f1] 26764 1726882732.43468: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f1 26764 1726882732.43578: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f1 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'rpltstbr': add connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153", "[004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (not-active)" ] } 26764 1726882732.43641: no more pending results, returning what we have 26764 1726882732.43644: results queue empty 26764 1726882732.43646: checking for any_errors_fatal 26764 1726882732.43651: done checking for any_errors_fatal 26764 1726882732.43652: checking for max_fail_percentage 26764 1726882732.43653: done checking for max_fail_percentage 26764 1726882732.43655: checking to see if all hosts have failed and the running result is not ok 26764 1726882732.43655: done checking to see if all hosts have failed 26764 1726882732.43656: getting the remaining hosts for this loop 26764 1726882732.43658: done getting the remaining hosts for this loop 26764 1726882732.43661: getting the next task for host managed_node2 26764 1726882732.43673: done getting next task for host managed_node2 26764 1726882732.43679: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 26764 1726882732.43682: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882732.43693: getting variables 26764 1726882732.43694: in VariableManager get_vars() 26764 1726882732.43735: Calling all_inventory to load vars for managed_node2 26764 1726882732.43737: Calling groups_inventory to load vars for managed_node2 26764 1726882732.43740: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882732.43750: Calling all_plugins_play to load vars for managed_node2 26764 1726882732.43753: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882732.43756: Calling groups_plugins_play to load vars for managed_node2 26764 1726882732.44805: WORKER PROCESS EXITING 26764 1726882732.45796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882732.49476: done with get_vars() 26764 1726882732.49503: done getting variables 26764 1726882732.49685: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:38:52 -0400 (0:00:00.086) 0:00:18.438 ****** 26764 1726882732.49719: entering _queue_task() for managed_node2/debug 26764 1726882732.50592: worker is 1 (out of 1 available) 26764 1726882732.50671: exiting _queue_task() for managed_node2/debug 26764 1726882732.50684: done queuing things up, now waiting for results queue to drain 26764 1726882732.50686: waiting for pending results... 
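
[Editor's note] The next task (roles/network/tasks/main.yml:181) is likewise a debug action; based on the full result object it dumps below, a minimal sketch of it would be (wording assumed, not copied from the role source):

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result   # prints the whole set_fact result, including module_args, stderr and stderr_lines
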
26764 1726882732.51612: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 26764 1726882732.51819: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001f2 26764 1726882732.51837: variable 'ansible_search_path' from source: unknown 26764 1726882732.51845: variable 'ansible_search_path' from source: unknown 26764 1726882732.51932: calling self._execute() 26764 1726882732.52143: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.52154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.52171: variable 'omit' from source: magic vars 26764 1726882732.53005: variable 'ansible_distribution_major_version' from source: facts 26764 1726882732.53024: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882732.53036: variable 'omit' from source: magic vars 26764 1726882732.53089: variable 'omit' from source: magic vars 26764 1726882732.53131: variable 'omit' from source: magic vars 26764 1726882732.53255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882732.53299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882732.53453: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882732.53480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882732.53497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882732.53529: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882732.53543: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.53551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.53772: Set connection var ansible_shell_executable to /bin/sh 26764 1726882732.53775: Set connection var ansible_shell_type to sh 26764 1726882732.53788: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882732.53791: Set connection var ansible_timeout to 10 26764 1726882732.53795: Set connection var ansible_connection to ssh 26764 1726882732.53801: Set connection var ansible_pipelining to False 26764 1726882732.53823: variable 'ansible_shell_executable' from source: unknown 26764 1726882732.53826: variable 'ansible_connection' from source: unknown 26764 1726882732.53829: variable 'ansible_module_compression' from source: unknown 26764 1726882732.53831: variable 'ansible_shell_type' from source: unknown 26764 1726882732.53833: variable 'ansible_shell_executable' from source: unknown 26764 1726882732.53835: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.53840: variable 'ansible_pipelining' from source: unknown 26764 1726882732.53842: variable 'ansible_timeout' from source: unknown 26764 1726882732.53847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.54212: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 
1726882732.54223: variable 'omit' from source: magic vars 26764 1726882732.54229: starting attempt loop 26764 1726882732.54232: running the handler 26764 1726882732.54419: variable '__network_connections_result' from source: set_fact 26764 1726882732.54491: variable '__network_connections_result' from source: set_fact 26764 1726882732.54721: handler run complete 26764 1726882732.54861: attempt loop complete, returning result 26764 1726882732.54869: _execute() done 26764 1726882732.54872: dumping result to json 26764 1726882732.54874: done dumping result, returning 26764 1726882732.54883: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-9875-c9a3-0000000001f2] 26764 1726882732.54888: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f2 26764 1726882732.54995: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f2 26764 1726882732.54999: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "auto6": false, "dhcp4": false }, "name": "rpltstbr", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'rpltstbr': add connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153\n[004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'rpltstbr': add connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153", "[004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (not-active)" ] } } 26764 1726882732.55090: no more pending results, returning what we have 26764 1726882732.55093: results queue empty 26764 1726882732.55094: checking for any_errors_fatal 26764 1726882732.55100: done checking for any_errors_fatal 26764 1726882732.55100: checking for max_fail_percentage 26764 1726882732.55102: done checking for max_fail_percentage 26764 1726882732.55103: checking to see if all hosts have failed and the running result is not ok 26764 1726882732.55103: done checking to see if all hosts have failed 26764 1726882732.55104: getting the remaining hosts for this loop 26764 1726882732.55105: done getting the remaining hosts for this loop 26764 1726882732.55110: getting the next task for host managed_node2 26764 1726882732.55117: done getting next task for host managed_node2 26764 1726882732.55120: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 26764 1726882732.55123: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882732.55132: getting variables 26764 1726882732.55134: in VariableManager get_vars() 26764 1726882732.55176: Calling all_inventory to load vars for managed_node2 26764 1726882732.55179: Calling groups_inventory to load vars for managed_node2 26764 1726882732.55181: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882732.55193: Calling all_plugins_play to load vars for managed_node2 26764 1726882732.55202: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882732.55205: Calling groups_plugins_play to load vars for managed_node2 26764 1726882732.57322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882732.59432: done with get_vars() 26764 1726882732.59453: done getting variables 26764 1726882732.59510: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:38:52 -0400 (0:00:00.098) 0:00:18.537 ****** 26764 1726882732.59541: entering _queue_task() for managed_node2/debug 26764 1726882732.59837: worker is 1 (out of 1 available) 26764 1726882732.59853: exiting _queue_task() for managed_node2/debug 26764 1726882732.59868: done queuing things up, now waiting for results queue to drain 26764 1726882732.59869: waiting for pending results... 26764 1726882732.60145: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 26764 1726882732.60261: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001f3 26764 1726882732.60288: variable 'ansible_search_path' from source: unknown 26764 1726882732.60295: variable 'ansible_search_path' from source: unknown 26764 1726882732.60337: calling self._execute() 26764 1726882732.60439: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.60452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.60471: variable 'omit' from source: magic vars 26764 1726882732.61113: variable 'ansible_distribution_major_version' from source: facts 26764 1726882732.61140: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882732.61274: variable 'network_state' from source: role '' defaults 26764 1726882732.61289: Evaluated conditional (network_state != {}): False 26764 1726882732.61295: when evaluation is False, skipping this task 26764 1726882732.61301: _execute() done 26764 1726882732.61307: dumping result to json 26764 1726882732.61314: done dumping result, returning 26764 1726882732.61324: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-9875-c9a3-0000000001f3] 26764 1726882732.61334: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f3 26764 1726882732.61445: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f3 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 26764 1726882732.61504: no more pending results, returning what we have 26764 1726882732.61508: results queue 
empty 26764 1726882732.61509: checking for any_errors_fatal 26764 1726882732.61517: done checking for any_errors_fatal 26764 1726882732.61518: checking for max_fail_percentage 26764 1726882732.61520: done checking for max_fail_percentage 26764 1726882732.61521: checking to see if all hosts have failed and the running result is not ok 26764 1726882732.61521: done checking to see if all hosts have failed 26764 1726882732.61522: getting the remaining hosts for this loop 26764 1726882732.61524: done getting the remaining hosts for this loop 26764 1726882732.61527: getting the next task for host managed_node2 26764 1726882732.61535: done getting next task for host managed_node2 26764 1726882732.61539: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 26764 1726882732.61543: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882732.61562: getting variables 26764 1726882732.61566: in VariableManager get_vars() 26764 1726882732.61611: Calling all_inventory to load vars for managed_node2 26764 1726882732.61614: Calling groups_inventory to load vars for managed_node2 26764 1726882732.61617: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882732.61630: Calling all_plugins_play to load vars for managed_node2 26764 1726882732.61634: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882732.61638: Calling groups_plugins_play to load vars for managed_node2 26764 1726882732.62750: WORKER PROCESS EXITING 26764 1726882732.63111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882732.64227: done with get_vars() 26764 1726882732.64247: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:38:52 -0400 (0:00:00.048) 0:00:18.585 ****** 26764 1726882732.64352: entering _queue_task() for managed_node2/ping 26764 1726882732.64658: worker is 1 (out of 1 available) 26764 1726882732.64675: exiting _queue_task() for managed_node2/ping 26764 1726882732.64690: done queuing things up, now waiting for results queue to drain 26764 1726882732.64691: waiting for pending results... 
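
[Editor's note] The module_args echoed in the preceding task result imply the play invoked the network role with a network_connections variable along these lines (reconstructed from the logged arguments; key order and indentation assumed):

    network_connections:
      - name: rpltstbr        # bridge profile created and brought up in this run
        type: bridge
        state: up
        ip:
          dhcp4: false
          auto6: false
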
26764 1726882732.65166: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 26764 1726882732.65171: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000001f4 26764 1726882732.65175: variable 'ansible_search_path' from source: unknown 26764 1726882732.65178: variable 'ansible_search_path' from source: unknown 26764 1726882732.65180: calling self._execute() 26764 1726882732.65449: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.65452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.65456: variable 'omit' from source: magic vars 26764 1726882732.65869: variable 'ansible_distribution_major_version' from source: facts 26764 1726882732.65872: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882732.65875: variable 'omit' from source: magic vars 26764 1726882732.65878: variable 'omit' from source: magic vars 26764 1726882732.65880: variable 'omit' from source: magic vars 26764 1726882732.65913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882732.65950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882732.65982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882732.65999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882732.66017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882732.66051: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882732.66054: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.66057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.66170: Set connection var ansible_shell_executable to /bin/sh 26764 1726882732.66174: Set connection var ansible_shell_type to sh 26764 1726882732.66192: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882732.66196: Set connection var ansible_timeout to 10 26764 1726882732.66462: Set connection var ansible_connection to ssh 26764 1726882732.66467: Set connection var ansible_pipelining to False 26764 1726882732.66470: variable 'ansible_shell_executable' from source: unknown 26764 1726882732.66472: variable 'ansible_connection' from source: unknown 26764 1726882732.66475: variable 'ansible_module_compression' from source: unknown 26764 1726882732.66477: variable 'ansible_shell_type' from source: unknown 26764 1726882732.66479: variable 'ansible_shell_executable' from source: unknown 26764 1726882732.66481: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882732.66483: variable 'ansible_pipelining' from source: unknown 26764 1726882732.66485: variable 'ansible_timeout' from source: unknown 26764 1726882732.66487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882732.66489: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882732.66493: variable 'omit' from source: magic vars 26764 
1726882732.66495: starting attempt loop 26764 1726882732.66497: running the handler 26764 1726882732.66502: _low_level_execute_command(): starting 26764 1726882732.66516: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882732.67613: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.67622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.67655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.67659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882732.67678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.67685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.67756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882732.67759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882732.67783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882732.67899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882732.69584: stdout chunk (state=3): >>>/root <<< 26764 1726882732.69804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882732.69882: stderr chunk (state=3): >>><<< 26764 1726882732.69898: stdout chunk (state=3): >>><<< 26764 1726882732.70019: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882732.70022: _low_level_execute_command(): starting 26764 1726882732.70024: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690 `" && echo ansible-tmp-1726882732.6992524-27568-15685650050690="` echo /root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690 `" ) && sleep 0' 26764 1726882732.71016: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882732.71022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882732.71030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.71040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.71090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.71100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882732.71107: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882732.71119: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882732.71122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.71132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.71140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882732.71145: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882732.71151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.71261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882732.71267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882732.71270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882732.71403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882732.73271: stdout chunk (state=3): >>>ansible-tmp-1726882732.6992524-27568-15685650050690=/root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690 <<< 26764 1726882732.73382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882732.73422: stderr chunk (state=3): >>><<< 26764 1726882732.73425: stdout chunk (state=3): >>><<< 26764 1726882732.73437: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882732.6992524-27568-15685650050690=/root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882732.73476: variable 'ansible_module_compression' from source: unknown 26764 1726882732.73506: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 26764 1726882732.73536: variable 'ansible_facts' from source: unknown 26764 1726882732.73591: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690/AnsiballZ_ping.py 26764 1726882732.73693: Sending initial data 26764 1726882732.73696: Sent initial data (152 bytes) 26764 1726882732.74645: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882732.74649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.74651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.74699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882732.74703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882732.74705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.74799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882732.74810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882732.74917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882732.76639: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882732.76731: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882732.76829: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmph9lwd75m 
/root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690/AnsiballZ_ping.py <<< 26764 1726882732.76926: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882732.77915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882732.77995: stderr chunk (state=3): >>><<< 26764 1726882732.77998: stdout chunk (state=3): >>><<< 26764 1726882732.78016: done transferring module to remote 26764 1726882732.78024: _low_level_execute_command(): starting 26764 1726882732.78028: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690/ /root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690/AnsiballZ_ping.py && sleep 0' 26764 1726882732.78426: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.78431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.78471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.78485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.78532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882732.78536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882732.78659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882732.80393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882732.80432: stderr chunk (state=3): >>><<< 26764 1726882732.80435: stdout chunk (state=3): >>><<< 26764 1726882732.80446: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882732.80449: _low_level_execute_command(): starting 26764 1726882732.80454: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690/AnsiballZ_ping.py && sleep 0' 26764 1726882732.80851: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.80857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.80890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.80902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.80953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882732.80969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882732.81078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882732.93997: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 26764 1726882732.95071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 26764 1726882732.95075: stdout chunk (state=3): >>><<< 26764 1726882732.95078: stderr chunk (state=3): >>><<< 26764 1726882732.95219: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 26764 1726882732.95223: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882732.95226: _low_level_execute_command(): starting 26764 1726882732.95229: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882732.6992524-27568-15685650050690/ > /dev/null 2>&1 && sleep 0' 26764 1726882732.95821: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882732.95836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882732.95850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.95875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.95918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882732.95930: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882732.95944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.95960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882732.95979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882732.95996: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882732.96011: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882732.96025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882732.96040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882732.96052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882732.96062: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882732.96081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882732.96158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882732.96189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882732.96212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882732.96348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882732.98200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882732.98295: stderr chunk (state=3): >>><<< 26764 1726882732.98306: stdout chunk (state=3): >>><<< 26764 1726882732.98576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882732.98580: handler run complete 26764 1726882732.98582: attempt loop complete, returning result 26764 1726882732.98584: _execute() done 26764 1726882732.98586: dumping result to json 26764 1726882732.98588: done dumping result, returning 26764 1726882732.98590: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-9875-c9a3-0000000001f4] 26764 1726882732.98591: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f4 26764 1726882732.98658: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000001f4 26764 1726882732.98661: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 26764 1726882732.98740: no more pending results, returning what we have 26764 1726882732.98744: results queue empty 26764 1726882732.98745: checking for any_errors_fatal 26764 1726882732.98752: done checking for any_errors_fatal 26764 1726882732.98752: checking for max_fail_percentage 26764 1726882732.98754: done checking for max_fail_percentage 26764 
1726882732.98755: checking to see if all hosts have failed and the running result is not ok 26764 1726882732.98756: done checking to see if all hosts have failed 26764 1726882732.98757: getting the remaining hosts for this loop 26764 1726882732.98759: done getting the remaining hosts for this loop 26764 1726882732.98762: getting the next task for host managed_node2 26764 1726882732.98776: done getting next task for host managed_node2 26764 1726882732.98779: ^ task is: TASK: meta (role_complete) 26764 1726882732.98782: ^ state is: HOST STATE: block=5, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882732.98793: getting variables 26764 1726882732.98795: in VariableManager get_vars() 26764 1726882732.98840: Calling all_inventory to load vars for managed_node2 26764 1726882732.98843: Calling groups_inventory to load vars for managed_node2 26764 1726882732.98846: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882732.98857: Calling all_plugins_play to load vars for managed_node2 26764 1726882732.98861: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882732.98868: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.00759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.02606: done with get_vars() 26764 1726882733.02627: done getting variables 26764 1726882733.02721: done queuing things up, now waiting for results queue to drain 26764 1726882733.02723: results queue empty 26764 1726882733.02724: checking for any_errors_fatal 26764 1726882733.02727: done checking for any_errors_fatal 26764 1726882733.02728: checking for max_fail_percentage 26764 1726882733.02729: done checking for max_fail_percentage 26764 1726882733.02729: checking to see if all hosts have failed and the running result is not ok 26764 1726882733.02730: done checking to see if all hosts have failed 26764 1726882733.02731: getting the remaining hosts for this loop 26764 1726882733.02732: done getting the remaining hosts for this loop 26764 1726882733.02734: getting the next task for host managed_node2 26764 1726882733.02738: done getting next task for host managed_node2 26764 1726882733.02740: ^ task is: TASK: Include the task 'assert_device_present.yml' 26764 1726882733.02741: ^ state is: HOST STATE: block=5, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882733.02743: getting variables 26764 1726882733.02744: in VariableManager get_vars() 26764 1726882733.02755: Calling all_inventory to load vars for managed_node2 26764 1726882733.02757: Calling groups_inventory to load vars for managed_node2 26764 1726882733.02759: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.02768: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.02770: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.02773: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.04080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.05914: done with get_vars() 26764 1726882733.05934: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:38 Friday 20 September 2024 21:38:53 -0400 (0:00:00.416) 0:00:19.001 ****** 26764 1726882733.06008: entering _queue_task() for managed_node2/include_tasks 26764 1726882733.06340: worker is 1 (out of 1 available) 26764 1726882733.06352: exiting _queue_task() for managed_node2/include_tasks 26764 1726882733.06368: done queuing things up, now waiting for results queue to drain 26764 1726882733.06369: waiting for pending results... 26764 1726882733.06652: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 26764 1726882733.06761: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000002b 26764 1726882733.06785: variable 'ansible_search_path' from source: unknown 26764 1726882733.06833: calling self._execute() 26764 1726882733.06940: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.06951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.06969: variable 'omit' from source: magic vars 26764 1726882733.07398: variable 'ansible_distribution_major_version' from source: facts 26764 1726882733.07414: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882733.07423: _execute() done 26764 1726882733.07430: dumping result to json 26764 1726882733.07437: done dumping result, returning 26764 1726882733.07445: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [0e448fcc-3ce9-9875-c9a3-00000000002b] 26764 1726882733.07453: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002b 26764 1726882733.07569: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002b 26764 1726882733.07597: no more pending results, returning what we have 26764 1726882733.07602: in VariableManager get_vars() 26764 1726882733.07643: Calling all_inventory to load vars for managed_node2 26764 1726882733.07646: Calling groups_inventory to load vars for managed_node2 26764 1726882733.07648: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.07661: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.07668: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.07672: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.08767: WORKER PROCESS EXITING 26764 1726882733.09592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.10750: done with get_vars() 26764 
1726882733.10763: variable 'ansible_search_path' from source: unknown 26764 1726882733.10776: we have included files to process 26764 1726882733.10777: generating all_blocks data 26764 1726882733.10778: done generating all_blocks data 26764 1726882733.10781: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 26764 1726882733.10782: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 26764 1726882733.10783: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 26764 1726882733.10894: in VariableManager get_vars() 26764 1726882733.10908: done with get_vars() 26764 1726882733.10984: done processing included file 26764 1726882733.10986: iterating over new_blocks loaded from include file 26764 1726882733.10987: in VariableManager get_vars() 26764 1726882733.10996: done with get_vars() 26764 1726882733.10997: filtering new block on tags 26764 1726882733.11010: done filtering new block on tags 26764 1726882733.11012: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 26764 1726882733.11015: extending task lists for all hosts with included blocks 26764 1726882733.11648: done extending task lists 26764 1726882733.11649: done processing included files 26764 1726882733.11650: results queue empty 26764 1726882733.11650: checking for any_errors_fatal 26764 1726882733.11651: done checking for any_errors_fatal 26764 1726882733.11652: checking for max_fail_percentage 26764 1726882733.11652: done checking for max_fail_percentage 26764 1726882733.11653: checking to see if all hosts have failed and the running result is not ok 26764 1726882733.11653: done checking to see if all hosts have failed 26764 1726882733.11654: getting the remaining hosts for this loop 26764 1726882733.11654: done getting the remaining hosts for this loop 26764 1726882733.11656: getting the next task for host managed_node2 26764 1726882733.11660: done getting next task for host managed_node2 26764 1726882733.11661: ^ task is: TASK: Include the task 'get_interface_stat.yml' 26764 1726882733.11663: ^ state is: HOST STATE: block=5, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882733.11667: getting variables 26764 1726882733.11668: in VariableManager get_vars() 26764 1726882733.11678: Calling all_inventory to load vars for managed_node2 26764 1726882733.11680: Calling groups_inventory to load vars for managed_node2 26764 1726882733.11681: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.11685: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.11686: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.11688: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.12451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.14028: done with get_vars() 26764 1726882733.14048: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:38:53 -0400 (0:00:00.081) 0:00:19.082 ****** 26764 1726882733.14119: entering _queue_task() for managed_node2/include_tasks 26764 1726882733.14401: worker is 1 (out of 1 available) 26764 1726882733.14412: exiting _queue_task() for managed_node2/include_tasks 26764 1726882733.14425: done queuing things up, now waiting for results queue to drain 26764 1726882733.14426: waiting for pending results... 26764 1726882733.14688: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 26764 1726882733.14756: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000002fc 26764 1726882733.14773: variable 'ansible_search_path' from source: unknown 26764 1726882733.14776: variable 'ansible_search_path' from source: unknown 26764 1726882733.14803: calling self._execute() 26764 1726882733.14876: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.14881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.14890: variable 'omit' from source: magic vars 26764 1726882733.15152: variable 'ansible_distribution_major_version' from source: facts 26764 1726882733.15161: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882733.15171: _execute() done 26764 1726882733.15174: dumping result to json 26764 1726882733.15177: done dumping result, returning 26764 1726882733.15184: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-9875-c9a3-0000000002fc] 26764 1726882733.15189: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000002fc 26764 1726882733.15271: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000002fc 26764 1726882733.15273: WORKER PROCESS EXITING 26764 1726882733.15330: no more pending results, returning what we have 26764 1726882733.15334: in VariableManager get_vars() 26764 1726882733.15369: Calling all_inventory to load vars for managed_node2 26764 1726882733.15371: Calling groups_inventory to load vars for managed_node2 26764 1726882733.15373: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.15383: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.15386: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.15388: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.16148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 26764 1726882733.17518: done with get_vars() 26764 1726882733.17531: variable 'ansible_search_path' from source: unknown 26764 1726882733.17532: variable 'ansible_search_path' from source: unknown 26764 1726882733.17556: we have included files to process 26764 1726882733.17557: generating all_blocks data 26764 1726882733.17558: done generating all_blocks data 26764 1726882733.17559: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26764 1726882733.17560: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26764 1726882733.17562: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 26764 1726882733.17692: done processing included file 26764 1726882733.17694: iterating over new_blocks loaded from include file 26764 1726882733.17695: in VariableManager get_vars() 26764 1726882733.17705: done with get_vars() 26764 1726882733.17707: filtering new block on tags 26764 1726882733.17718: done filtering new block on tags 26764 1726882733.17720: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 26764 1726882733.17723: extending task lists for all hosts with included blocks 26764 1726882733.17787: done extending task lists 26764 1726882733.17788: done processing included files 26764 1726882733.17789: results queue empty 26764 1726882733.17789: checking for any_errors_fatal 26764 1726882733.17791: done checking for any_errors_fatal 26764 1726882733.17792: checking for max_fail_percentage 26764 1726882733.17792: done checking for max_fail_percentage 26764 1726882733.17793: checking to see if all hosts have failed and the running result is not ok 26764 1726882733.17793: done checking to see if all hosts have failed 26764 1726882733.17794: getting the remaining hosts for this loop 26764 1726882733.17794: done getting the remaining hosts for this loop 26764 1726882733.17796: getting the next task for host managed_node2 26764 1726882733.17799: done getting next task for host managed_node2 26764 1726882733.17800: ^ task is: TASK: Get stat for interface {{ interface }} 26764 1726882733.17802: ^ state is: HOST STATE: block=5, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882733.17803: getting variables 26764 1726882733.17804: in VariableManager get_vars() 26764 1726882733.17812: Calling all_inventory to load vars for managed_node2 26764 1726882733.17813: Calling groups_inventory to load vars for managed_node2 26764 1726882733.17814: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.17819: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.17821: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.17823: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.18523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.19425: done with get_vars() 26764 1726882733.19440: done getting variables 26764 1726882733.19557: variable 'interface' from source: play vars TASK [Get stat for interface rpltstbr] ***************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:38:53 -0400 (0:00:00.054) 0:00:19.137 ****** 26764 1726882733.19583: entering _queue_task() for managed_node2/stat 26764 1726882733.19793: worker is 1 (out of 1 available) 26764 1726882733.19806: exiting _queue_task() for managed_node2/stat 26764 1726882733.19820: done queuing things up, now waiting for results queue to drain 26764 1726882733.19821: waiting for pending results... 26764 1726882733.19994: running TaskExecutor() for managed_node2/TASK: Get stat for interface rpltstbr 26764 1726882733.20073: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000037e 26764 1726882733.20086: variable 'ansible_search_path' from source: unknown 26764 1726882733.20090: variable 'ansible_search_path' from source: unknown 26764 1726882733.20121: calling self._execute() 26764 1726882733.20194: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.20198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.20208: variable 'omit' from source: magic vars 26764 1726882733.20468: variable 'ansible_distribution_major_version' from source: facts 26764 1726882733.20480: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882733.20487: variable 'omit' from source: magic vars 26764 1726882733.20517: variable 'omit' from source: magic vars 26764 1726882733.20586: variable 'interface' from source: play vars 26764 1726882733.20598: variable 'omit' from source: magic vars 26764 1726882733.20630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882733.20658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882733.20679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882733.20693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882733.20704: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882733.20727: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882733.20730: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.20732: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 26764 1726882733.20806: Set connection var ansible_shell_executable to /bin/sh 26764 1726882733.20809: Set connection var ansible_shell_type to sh 26764 1726882733.20816: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882733.20821: Set connection var ansible_timeout to 10 26764 1726882733.20826: Set connection var ansible_connection to ssh 26764 1726882733.20831: Set connection var ansible_pipelining to False 26764 1726882733.20847: variable 'ansible_shell_executable' from source: unknown 26764 1726882733.20850: variable 'ansible_connection' from source: unknown 26764 1726882733.20855: variable 'ansible_module_compression' from source: unknown 26764 1726882733.20857: variable 'ansible_shell_type' from source: unknown 26764 1726882733.20860: variable 'ansible_shell_executable' from source: unknown 26764 1726882733.20862: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.20865: variable 'ansible_pipelining' from source: unknown 26764 1726882733.20872: variable 'ansible_timeout' from source: unknown 26764 1726882733.20876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.21016: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882733.21027: variable 'omit' from source: magic vars 26764 1726882733.21033: starting attempt loop 26764 1726882733.21036: running the handler 26764 1726882733.21047: _low_level_execute_command(): starting 26764 1726882733.21054: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882733.21572: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882733.21576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.21607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.21611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882733.21613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.21668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882733.21672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.21788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.23454: stdout chunk (state=3): >>>/root <<< 26764 1726882733.23559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882733.23609: stderr chunk (state=3): >>><<< 26764 
1726882733.23614: stdout chunk (state=3): >>><<< 26764 1726882733.23636: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882733.23645: _low_level_execute_command(): starting 26764 1726882733.23650: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928 `" && echo ansible-tmp-1726882733.236335-27591-16568602128928="` echo /root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928 `" ) && sleep 0' 26764 1726882733.24088: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882733.24094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.24146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.24156: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882733.24159: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882733.24161: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.24202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882733.24207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.24331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.26207: stdout chunk (state=3): >>>ansible-tmp-1726882733.236335-27591-16568602128928=/root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928 <<< 26764 1726882733.26314: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 26764 1726882733.26356: stderr chunk (state=3): >>><<< 26764 1726882733.26367: stdout chunk (state=3): >>><<< 26764 1726882733.26384: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882733.236335-27591-16568602128928=/root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882733.26415: variable 'ansible_module_compression' from source: unknown 26764 1726882733.26455: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 26764 1726882733.26487: variable 'ansible_facts' from source: unknown 26764 1726882733.26541: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928/AnsiballZ_stat.py 26764 1726882733.26637: Sending initial data 26764 1726882733.26641: Sent initial data (151 bytes) 26764 1726882733.27284: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882733.27290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.27320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.27332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.27386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882733.27403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.27500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.29230: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882733.29324: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882733.29420: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpyeslbrml /root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928/AnsiballZ_stat.py <<< 26764 1726882733.29516: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882733.30524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882733.30612: stderr chunk (state=3): >>><<< 26764 1726882733.30616: stdout chunk (state=3): >>><<< 26764 1726882733.30630: done transferring module to remote 26764 1726882733.30638: _low_level_execute_command(): starting 26764 1726882733.30642: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928/ /root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928/AnsiballZ_stat.py && sleep 0' 26764 1726882733.31055: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882733.31061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.31094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.31106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.31166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882733.31170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.31284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.33010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882733.33056: stderr chunk (state=3): >>><<< 26764 1726882733.33059: stdout chunk (state=3): >>><<< 26764 1726882733.33075: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882733.33078: _low_level_execute_command(): starting 26764 1726882733.33081: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928/AnsiballZ_stat.py && sleep 0' 26764 1726882733.33489: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882733.33495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.33520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.33531: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.33582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882733.33594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.33714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.46806: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/rpltstbr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31184, "dev": 21, "nlink": 1, "atime": 1726882732.2192006, "mtime": 1726882732.2192006, "ctime": 1726882732.2192006, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/rpltstbr", "lnk_target": "../../devices/virtual/net/rpltstbr", "pw_name": 
"root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/rpltstbr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 26764 1726882733.47741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882733.47796: stderr chunk (state=3): >>><<< 26764 1726882733.47800: stdout chunk (state=3): >>><<< 26764 1726882733.47818: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/rpltstbr", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31184, "dev": 21, "nlink": 1, "atime": 1726882732.2192006, "mtime": 1726882732.2192006, "ctime": 1726882732.2192006, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/rpltstbr", "lnk_target": "../../devices/virtual/net/rpltstbr", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/rpltstbr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
26764 1726882733.47855: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/rpltstbr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882733.47862: _low_level_execute_command(): starting 26764 1726882733.47869: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882733.236335-27591-16568602128928/ > /dev/null 2>&1 && sleep 0' 26764 1726882733.48318: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882733.48330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.48362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882733.48376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882733.48387: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.48438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882733.48450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.48558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.50372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882733.50416: stderr chunk (state=3): >>><<< 26764 1726882733.50419: stdout chunk (state=3): >>><<< 26764 1726882733.50431: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882733.50437: handler run complete 26764 1726882733.50471: attempt loop complete, returning result 26764 1726882733.50476: _execute() done 26764 1726882733.50478: dumping result to json 26764 1726882733.50482: done dumping result, returning 26764 1726882733.50495: done running TaskExecutor() for managed_node2/TASK: Get stat for interface rpltstbr [0e448fcc-3ce9-9875-c9a3-00000000037e] 26764 1726882733.50498: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000037e 26764 1726882733.50602: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000037e 26764 1726882733.50605: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882732.2192006, "block_size": 4096, "blocks": 0, "ctime": 1726882732.2192006, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31184, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/rpltstbr", "lnk_target": "../../devices/virtual/net/rpltstbr", "mode": "0777", "mtime": 1726882732.2192006, "nlink": 1, "path": "/sys/class/net/rpltstbr", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 26764 1726882733.50695: no more pending results, returning what we have 26764 1726882733.50699: results queue empty 26764 1726882733.50700: checking for any_errors_fatal 26764 1726882733.50702: done checking for any_errors_fatal 26764 1726882733.50703: checking for max_fail_percentage 26764 1726882733.50704: done checking for max_fail_percentage 26764 1726882733.50705: checking to see if all hosts have failed and the running result is not ok 26764 1726882733.50706: done checking to see if all hosts have failed 26764 1726882733.50707: getting the remaining hosts for this loop 26764 1726882733.50708: done getting the remaining hosts for this loop 26764 1726882733.50712: getting the next task for host managed_node2 26764 1726882733.50720: done getting next task for host managed_node2 26764 1726882733.50723: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 26764 1726882733.50725: ^ state is: HOST STATE: block=5, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882733.50728: getting variables 26764 1726882733.50730: in VariableManager get_vars() 26764 1726882733.50770: Calling all_inventory to load vars for managed_node2 26764 1726882733.50773: Calling groups_inventory to load vars for managed_node2 26764 1726882733.50775: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.50785: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.50788: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.50790: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.51969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.53770: done with get_vars() 26764 1726882733.53792: done getting variables 26764 1726882733.53852: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 26764 1726882733.53978: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'rpltstbr'] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:38:53 -0400 (0:00:00.344) 0:00:19.481 ****** 26764 1726882733.54007: entering _queue_task() for managed_node2/assert 26764 1726882733.54229: worker is 1 (out of 1 available) 26764 1726882733.54241: exiting _queue_task() for managed_node2/assert 26764 1726882733.54253: done queuing things up, now waiting for results queue to drain 26764 1726882733.54254: waiting for pending results... 
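
The conditional evaluated for this task a little further on, interface_stat.stat.exists, indicates that assert_device_present.yml:5 contains an assertion along the following lines; any failure message it may define is not visible in the log because the assertion passes, so this is only a sketch.

# Sketch of the assertion at assert_device_present.yml:5 (failure message, if any, omitted)
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      # passes here because the stat above found /sys/class/net/rpltstbr
      - interface_stat.stat.exists
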
26764 1726882733.54426: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'rpltstbr' 26764 1726882733.54495: in run() - task 0e448fcc-3ce9-9875-c9a3-0000000002fd 26764 1726882733.54506: variable 'ansible_search_path' from source: unknown 26764 1726882733.54510: variable 'ansible_search_path' from source: unknown 26764 1726882733.54538: calling self._execute() 26764 1726882733.54616: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.54620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.54635: variable 'omit' from source: magic vars 26764 1726882733.54893: variable 'ansible_distribution_major_version' from source: facts 26764 1726882733.54904: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882733.54911: variable 'omit' from source: magic vars 26764 1726882733.54936: variable 'omit' from source: magic vars 26764 1726882733.55001: variable 'interface' from source: play vars 26764 1726882733.55015: variable 'omit' from source: magic vars 26764 1726882733.55049: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882733.55078: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882733.55095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882733.55108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882733.55117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882733.55140: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882733.55143: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.55146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.55219: Set connection var ansible_shell_executable to /bin/sh 26764 1726882733.55223: Set connection var ansible_shell_type to sh 26764 1726882733.55230: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882733.55236: Set connection var ansible_timeout to 10 26764 1726882733.55239: Set connection var ansible_connection to ssh 26764 1726882733.55245: Set connection var ansible_pipelining to False 26764 1726882733.55262: variable 'ansible_shell_executable' from source: unknown 26764 1726882733.55269: variable 'ansible_connection' from source: unknown 26764 1726882733.55272: variable 'ansible_module_compression' from source: unknown 26764 1726882733.55275: variable 'ansible_shell_type' from source: unknown 26764 1726882733.55278: variable 'ansible_shell_executable' from source: unknown 26764 1726882733.55281: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.55283: variable 'ansible_pipelining' from source: unknown 26764 1726882733.55285: variable 'ansible_timeout' from source: unknown 26764 1726882733.55287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.55383: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 26764 1726882733.55392: variable 'omit' from source: magic vars 26764 1726882733.55398: starting attempt loop 26764 1726882733.55401: running the handler 26764 1726882733.55489: variable 'interface_stat' from source: set_fact 26764 1726882733.55503: Evaluated conditional (interface_stat.stat.exists): True 26764 1726882733.55508: handler run complete 26764 1726882733.55524: attempt loop complete, returning result 26764 1726882733.55527: _execute() done 26764 1726882733.55529: dumping result to json 26764 1726882733.55532: done dumping result, returning 26764 1726882733.55538: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'rpltstbr' [0e448fcc-3ce9-9875-c9a3-0000000002fd] 26764 1726882733.55543: sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000002fd 26764 1726882733.55628: done sending task result for task 0e448fcc-3ce9-9875-c9a3-0000000002fd 26764 1726882733.55631: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 26764 1726882733.55678: no more pending results, returning what we have 26764 1726882733.55681: results queue empty 26764 1726882733.55683: checking for any_errors_fatal 26764 1726882733.55689: done checking for any_errors_fatal 26764 1726882733.55690: checking for max_fail_percentage 26764 1726882733.55691: done checking for max_fail_percentage 26764 1726882733.55692: checking to see if all hosts have failed and the running result is not ok 26764 1726882733.55692: done checking to see if all hosts have failed 26764 1726882733.55693: getting the remaining hosts for this loop 26764 1726882733.55695: done getting the remaining hosts for this loop 26764 1726882733.55698: getting the next task for host managed_node2 26764 1726882733.55704: done getting next task for host managed_node2 26764 1726882733.55707: ^ task is: TASK: Include the task 'assert_profile_present.yml' 26764 1726882733.55709: ^ state is: HOST STATE: block=5, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882733.55712: getting variables 26764 1726882733.55713: in VariableManager get_vars() 26764 1726882733.55749: Calling all_inventory to load vars for managed_node2 26764 1726882733.55752: Calling groups_inventory to load vars for managed_node2 26764 1726882733.55754: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.55762: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.55768: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.55772: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.56525: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.57446: done with get_vars() 26764 1726882733.57460: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:40 Friday 20 September 2024 21:38:53 -0400 (0:00:00.035) 0:00:19.516 ****** 26764 1726882733.57523: entering _queue_task() for managed_node2/include_tasks 26764 1726882733.57702: worker is 1 (out of 1 available) 26764 1726882733.57714: exiting _queue_task() for managed_node2/include_tasks 26764 1726882733.57726: done queuing things up, now waiting for results queue to drain 26764 1726882733.57727: waiting for pending results... 26764 1726882733.57883: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' 26764 1726882733.57939: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000002c 26764 1726882733.57949: variable 'ansible_search_path' from source: unknown 26764 1726882733.57979: calling self._execute() 26764 1726882733.58042: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.58045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.58054: variable 'omit' from source: magic vars 26764 1726882733.58298: variable 'ansible_distribution_major_version' from source: facts 26764 1726882733.58308: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882733.58314: _execute() done 26764 1726882733.58317: dumping result to json 26764 1726882733.58321: done dumping result, returning 26764 1726882733.58327: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' [0e448fcc-3ce9-9875-c9a3-00000000002c] 26764 1726882733.58333: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002c 26764 1726882733.58416: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002c 26764 1726882733.58419: WORKER PROCESS EXITING 26764 1726882733.58451: no more pending results, returning what we have 26764 1726882733.58456: in VariableManager get_vars() 26764 1726882733.58493: Calling all_inventory to load vars for managed_node2 26764 1726882733.58496: Calling groups_inventory to load vars for managed_node2 26764 1726882733.58498: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.58507: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.58510: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.58512: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.59368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.63061: done with get_vars() 26764 
1726882733.63077: variable 'ansible_search_path' from source: unknown 26764 1726882733.63086: we have included files to process 26764 1726882733.63087: generating all_blocks data 26764 1726882733.63088: done generating all_blocks data 26764 1726882733.63089: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 26764 1726882733.63090: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 26764 1726882733.63091: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 26764 1726882733.63198: in VariableManager get_vars() 26764 1726882733.63210: done with get_vars() 26764 1726882733.63367: done processing included file 26764 1726882733.63368: iterating over new_blocks loaded from include file 26764 1726882733.63369: in VariableManager get_vars() 26764 1726882733.63379: done with get_vars() 26764 1726882733.63380: filtering new block on tags 26764 1726882733.63391: done filtering new block on tags 26764 1726882733.63393: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 26764 1726882733.63395: extending task lists for all hosts with included blocks 26764 1726882733.64098: done extending task lists 26764 1726882733.64099: done processing included files 26764 1726882733.64100: results queue empty 26764 1726882733.64100: checking for any_errors_fatal 26764 1726882733.64101: done checking for any_errors_fatal 26764 1726882733.64102: checking for max_fail_percentage 26764 1726882733.64103: done checking for max_fail_percentage 26764 1726882733.64103: checking to see if all hosts have failed and the running result is not ok 26764 1726882733.64104: done checking to see if all hosts have failed 26764 1726882733.64104: getting the remaining hosts for this loop 26764 1726882733.64105: done getting the remaining hosts for this loop 26764 1726882733.64106: getting the next task for host managed_node2 26764 1726882733.64108: done getting next task for host managed_node2 26764 1726882733.64110: ^ task is: TASK: Include the task 'get_profile_stat.yml' 26764 1726882733.64111: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882733.64112: getting variables 26764 1726882733.64113: in VariableManager get_vars() 26764 1726882733.64120: Calling all_inventory to load vars for managed_node2 26764 1726882733.64121: Calling groups_inventory to load vars for managed_node2 26764 1726882733.64122: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.64126: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.64127: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.64129: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.64759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.65656: done with get_vars() 26764 1726882733.65674: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:38:53 -0400 (0:00:00.081) 0:00:19.598 ****** 26764 1726882733.65718: entering _queue_task() for managed_node2/include_tasks 26764 1726882733.65946: worker is 1 (out of 1 available) 26764 1726882733.65959: exiting _queue_task() for managed_node2/include_tasks 26764 1726882733.65973: done queuing things up, now waiting for results queue to drain 26764 1726882733.65974: waiting for pending results... 26764 1726882733.66146: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 26764 1726882733.66218: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000396 26764 1726882733.66228: variable 'ansible_search_path' from source: unknown 26764 1726882733.66231: variable 'ansible_search_path' from source: unknown 26764 1726882733.66263: calling self._execute() 26764 1726882733.66333: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.66336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.66346: variable 'omit' from source: magic vars 26764 1726882733.66614: variable 'ansible_distribution_major_version' from source: facts 26764 1726882733.66623: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882733.66631: _execute() done 26764 1726882733.66634: dumping result to json 26764 1726882733.66637: done dumping result, returning 26764 1726882733.66641: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-9875-c9a3-000000000396] 26764 1726882733.66648: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000396 26764 1726882733.66736: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000396 26764 1726882733.66738: WORKER PROCESS EXITING 26764 1726882733.66777: no more pending results, returning what we have 26764 1726882733.66782: in VariableManager get_vars() 26764 1726882733.66822: Calling all_inventory to load vars for managed_node2 26764 1726882733.66825: Calling groups_inventory to load vars for managed_node2 26764 1726882733.66827: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.66840: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.66843: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.66846: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.67711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 26764 1726882733.68638: done with get_vars() 26764 1726882733.68649: variable 'ansible_search_path' from source: unknown 26764 1726882733.68650: variable 'ansible_search_path' from source: unknown 26764 1726882733.68679: we have included files to process 26764 1726882733.68680: generating all_blocks data 26764 1726882733.68681: done generating all_blocks data 26764 1726882733.68682: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 26764 1726882733.68683: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 26764 1726882733.68684: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 26764 1726882733.69345: done processing included file 26764 1726882733.69347: iterating over new_blocks loaded from include file 26764 1726882733.69348: in VariableManager get_vars() 26764 1726882733.69361: done with get_vars() 26764 1726882733.69362: filtering new block on tags 26764 1726882733.69379: done filtering new block on tags 26764 1726882733.69381: in VariableManager get_vars() 26764 1726882733.69389: done with get_vars() 26764 1726882733.69390: filtering new block on tags 26764 1726882733.69402: done filtering new block on tags 26764 1726882733.69403: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 26764 1726882733.69406: extending task lists for all hosts with included blocks 26764 1726882733.69509: done extending task lists 26764 1726882733.69510: done processing included files 26764 1726882733.69511: results queue empty 26764 1726882733.69511: checking for any_errors_fatal 26764 1726882733.69514: done checking for any_errors_fatal 26764 1726882733.69514: checking for max_fail_percentage 26764 1726882733.69515: done checking for max_fail_percentage 26764 1726882733.69515: checking to see if all hosts have failed and the running result is not ok 26764 1726882733.69516: done checking to see if all hosts have failed 26764 1726882733.69516: getting the remaining hosts for this loop 26764 1726882733.69517: done getting the remaining hosts for this loop 26764 1726882733.69519: getting the next task for host managed_node2 26764 1726882733.69521: done getting next task for host managed_node2 26764 1726882733.69523: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 26764 1726882733.69525: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882733.69526: getting variables 26764 1726882733.69527: in VariableManager get_vars() 26764 1726882733.69562: Calling all_inventory to load vars for managed_node2 26764 1726882733.69566: Calling groups_inventory to load vars for managed_node2 26764 1726882733.69568: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.69572: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.69574: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.69576: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.70205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.71152: done with get_vars() 26764 1726882733.71168: done getting variables 26764 1726882733.71194: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:38:53 -0400 (0:00:00.054) 0:00:19.653 ****** 26764 1726882733.71215: entering _queue_task() for managed_node2/set_fact 26764 1726882733.71418: worker is 1 (out of 1 available) 26764 1726882733.71431: exiting _queue_task() for managed_node2/set_fact 26764 1726882733.71444: done queuing things up, now waiting for results queue to drain 26764 1726882733.71445: waiting for pending results... 
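
Judging by the result printed a little further below (lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint all start out false), the set_fact task queued here from get_profile_stat.yml:3 is an initialization step that can be sketched as:

# Sketch of the flag initialization in get_profile_stat.yml:3, inferred from the logged ansible_facts result
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

The next task queued for the host, "Stat profile file", presumably provides the data used to update these flags.
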
26764 1726882733.71614: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 26764 1726882733.71682: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000425 26764 1726882733.71692: variable 'ansible_search_path' from source: unknown 26764 1726882733.71695: variable 'ansible_search_path' from source: unknown 26764 1726882733.71723: calling self._execute() 26764 1726882733.71796: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.71799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.71808: variable 'omit' from source: magic vars 26764 1726882733.72085: variable 'ansible_distribution_major_version' from source: facts 26764 1726882733.72094: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882733.72102: variable 'omit' from source: magic vars 26764 1726882733.72129: variable 'omit' from source: magic vars 26764 1726882733.72150: variable 'omit' from source: magic vars 26764 1726882733.72186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882733.72213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882733.72228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882733.72242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882733.72253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882733.72280: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882733.72285: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.72288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.72354: Set connection var ansible_shell_executable to /bin/sh 26764 1726882733.72357: Set connection var ansible_shell_type to sh 26764 1726882733.72366: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882733.72373: Set connection var ansible_timeout to 10 26764 1726882733.72378: Set connection var ansible_connection to ssh 26764 1726882733.72384: Set connection var ansible_pipelining to False 26764 1726882733.72404: variable 'ansible_shell_executable' from source: unknown 26764 1726882733.72407: variable 'ansible_connection' from source: unknown 26764 1726882733.72410: variable 'ansible_module_compression' from source: unknown 26764 1726882733.72412: variable 'ansible_shell_type' from source: unknown 26764 1726882733.72415: variable 'ansible_shell_executable' from source: unknown 26764 1726882733.72418: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.72420: variable 'ansible_pipelining' from source: unknown 26764 1726882733.72422: variable 'ansible_timeout' from source: unknown 26764 1726882733.72424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.72522: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882733.72529: variable 
'omit' from source: magic vars 26764 1726882733.72535: starting attempt loop 26764 1726882733.72539: running the handler 26764 1726882733.72548: handler run complete 26764 1726882733.72557: attempt loop complete, returning result 26764 1726882733.72559: _execute() done 26764 1726882733.72562: dumping result to json 26764 1726882733.72569: done dumping result, returning 26764 1726882733.72576: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-9875-c9a3-000000000425] 26764 1726882733.72580: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000425 26764 1726882733.72658: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000425 26764 1726882733.72661: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 26764 1726882733.72715: no more pending results, returning what we have 26764 1726882733.72718: results queue empty 26764 1726882733.72719: checking for any_errors_fatal 26764 1726882733.72720: done checking for any_errors_fatal 26764 1726882733.72721: checking for max_fail_percentage 26764 1726882733.72722: done checking for max_fail_percentage 26764 1726882733.72723: checking to see if all hosts have failed and the running result is not ok 26764 1726882733.72724: done checking to see if all hosts have failed 26764 1726882733.72725: getting the remaining hosts for this loop 26764 1726882733.72726: done getting the remaining hosts for this loop 26764 1726882733.72729: getting the next task for host managed_node2 26764 1726882733.72735: done getting next task for host managed_node2 26764 1726882733.72737: ^ task is: TASK: Stat profile file 26764 1726882733.72740: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882733.72743: getting variables 26764 1726882733.72744: in VariableManager get_vars() 26764 1726882733.72773: Calling all_inventory to load vars for managed_node2 26764 1726882733.72781: Calling groups_inventory to load vars for managed_node2 26764 1726882733.72784: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882733.72792: Calling all_plugins_play to load vars for managed_node2 26764 1726882733.72794: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882733.72796: Calling groups_plugins_play to load vars for managed_node2 26764 1726882733.73553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882733.74477: done with get_vars() 26764 1726882733.74491: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:38:53 -0400 (0:00:00.033) 0:00:19.687 ****** 26764 1726882733.74551: entering _queue_task() for managed_node2/stat 26764 1726882733.74741: worker is 1 (out of 1 available) 26764 1726882733.74756: exiting _queue_task() for managed_node2/stat 26764 1726882733.74770: done queuing things up, now waiting for results queue to drain 26764 1726882733.74771: waiting for pending results... 26764 1726882733.74931: running TaskExecutor() for managed_node2/TASK: Stat profile file 26764 1726882733.75000: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000426 26764 1726882733.75010: variable 'ansible_search_path' from source: unknown 26764 1726882733.75014: variable 'ansible_search_path' from source: unknown 26764 1726882733.75040: calling self._execute() 26764 1726882733.75113: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.75116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.75125: variable 'omit' from source: magic vars 26764 1726882733.75393: variable 'ansible_distribution_major_version' from source: facts 26764 1726882733.75400: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882733.75407: variable 'omit' from source: magic vars 26764 1726882733.75436: variable 'omit' from source: magic vars 26764 1726882733.75505: variable 'profile' from source: play vars 26764 1726882733.75509: variable 'interface' from source: play vars 26764 1726882733.75558: variable 'interface' from source: play vars 26764 1726882733.75574: variable 'omit' from source: magic vars 26764 1726882733.75609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882733.75633: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882733.75649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882733.75663: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882733.75678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882733.75700: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882733.75703: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.75707: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.75777: Set connection var ansible_shell_executable to /bin/sh 26764 1726882733.75781: Set connection var ansible_shell_type to sh 26764 1726882733.75788: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882733.75793: Set connection var ansible_timeout to 10 26764 1726882733.75798: Set connection var ansible_connection to ssh 26764 1726882733.75803: Set connection var ansible_pipelining to False 26764 1726882733.75820: variable 'ansible_shell_executable' from source: unknown 26764 1726882733.75823: variable 'ansible_connection' from source: unknown 26764 1726882733.75826: variable 'ansible_module_compression' from source: unknown 26764 1726882733.75828: variable 'ansible_shell_type' from source: unknown 26764 1726882733.75831: variable 'ansible_shell_executable' from source: unknown 26764 1726882733.75833: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882733.75835: variable 'ansible_pipelining' from source: unknown 26764 1726882733.75837: variable 'ansible_timeout' from source: unknown 26764 1726882733.75845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882733.75985: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882733.75993: variable 'omit' from source: magic vars 26764 1726882733.75999: starting attempt loop 26764 1726882733.76002: running the handler 26764 1726882733.76012: _low_level_execute_command(): starting 26764 1726882733.76019: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882733.76538: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882733.76547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.76582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.76595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.76650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882733.76656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882733.76675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.76791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.78441: stdout chunk (state=3): >>>/root <<< 26764 1726882733.78544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 
1726882733.78592: stderr chunk (state=3): >>><<< 26764 1726882733.78595: stdout chunk (state=3): >>><<< 26764 1726882733.78614: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882733.78624: _low_level_execute_command(): starting 26764 1726882733.78629: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684 `" && echo ansible-tmp-1726882733.7861292-27610-7377100430684="` echo /root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684 `" ) && sleep 0' 26764 1726882733.79054: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882733.79060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.79097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882733.79117: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.79159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882733.79177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.79281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.81138: stdout chunk (state=3): >>>ansible-tmp-1726882733.7861292-27610-7377100430684=/root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684 <<< 26764 1726882733.81249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882733.81292: stderr chunk 
(state=3): >>><<< 26764 1726882733.81296: stdout chunk (state=3): >>><<< 26764 1726882733.81308: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882733.7861292-27610-7377100430684=/root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882733.81345: variable 'ansible_module_compression' from source: unknown 26764 1726882733.81389: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 26764 1726882733.81419: variable 'ansible_facts' from source: unknown 26764 1726882733.81486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684/AnsiballZ_stat.py 26764 1726882733.81587: Sending initial data 26764 1726882733.81591: Sent initial data (151 bytes) 26764 1726882733.82228: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882733.82233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.82263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.82281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.82333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882733.82341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.82451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.84168: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 26764 1726882733.84175: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882733.84265: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882733.84366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmp_m8j1ns_ /root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684/AnsiballZ_stat.py <<< 26764 1726882733.84463: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882733.85484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882733.85577: stderr chunk (state=3): >>><<< 26764 1726882733.85580: stdout chunk (state=3): >>><<< 26764 1726882733.85595: done transferring module to remote 26764 1726882733.85603: _low_level_execute_command(): starting 26764 1726882733.85608: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684/ /root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684/AnsiballZ_stat.py && sleep 0' 26764 1726882733.86017: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882733.86022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.86052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.86066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.86119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882733.86130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.86236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882733.87955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882733.87999: stderr chunk (state=3): >>><<< 26764 1726882733.88002: stdout chunk (state=3): >>><<< 26764 1726882733.88019: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882733.88022: _low_level_execute_command(): starting 26764 1726882733.88029: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684/AnsiballZ_stat.py && sleep 0' 26764 1726882733.88447: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882733.88458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882733.88488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.88500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882733.88544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882733.88556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882733.88678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.01657: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-rpltstbr", "follow": false, "checksum_algorithm": "sha1"}}} <<< 26764 1726882734.02638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 26764 1726882734.02704: stderr chunk (state=3): >>><<< 26764 1726882734.02707: stdout chunk (state=3): >>><<< 26764 1726882734.02722: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-rpltstbr", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 26764 1726882734.02747: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-rpltstbr', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882734.02758: _low_level_execute_command(): starting 26764 1726882734.02762: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882733.7861292-27610-7377100430684/ > /dev/null 2>&1 && sleep 0' 26764 1726882734.03229: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882734.03235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.03267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.03281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882734.03291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.03341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882734.03353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.03461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.05258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.05307: stderr chunk (state=3): >>><<< 26764 1726882734.05310: stdout chunk (state=3): >>><<< 26764 1726882734.05326: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882734.05339: handler run complete 26764 1726882734.05356: attempt loop complete, returning result 26764 1726882734.05367: _execute() done 26764 1726882734.05370: dumping result to json 26764 1726882734.05373: done dumping result, returning 26764 1726882734.05379: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0e448fcc-3ce9-9875-c9a3-000000000426] 26764 1726882734.05384: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000426 26764 1726882734.05477: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000426 26764 1726882734.05480: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 26764 1726882734.05533: no more pending results, returning what we have 26764 1726882734.05537: results queue empty 26764 1726882734.05538: checking for any_errors_fatal 26764 1726882734.05545: done checking for any_errors_fatal 26764 1726882734.05546: checking for max_fail_percentage 26764 1726882734.05548: done checking for max_fail_percentage 26764 1726882734.05548: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.05549: done checking to see if all hosts have failed 26764 1726882734.05550: getting the remaining hosts for this loop 26764 1726882734.05551: done getting the remaining hosts for this loop 26764 1726882734.05555: getting the next task for host managed_node2 26764 
1726882734.05562: done getting next task for host managed_node2 26764 1726882734.05568: ^ task is: TASK: Set NM profile exist flag based on the profile files 26764 1726882734.05572: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882734.05576: getting variables 26764 1726882734.05578: in VariableManager get_vars() 26764 1726882734.05618: Calling all_inventory to load vars for managed_node2 26764 1726882734.05621: Calling groups_inventory to load vars for managed_node2 26764 1726882734.05623: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.05634: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.05637: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.05640: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.06599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.07525: done with get_vars() 26764 1726882734.07540: done getting variables 26764 1726882734.07587: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:38:54 -0400 (0:00:00.330) 0:00:20.017 ****** 26764 1726882734.07609: entering _queue_task() for managed_node2/set_fact 26764 1726882734.07823: worker is 1 (out of 1 available) 26764 1726882734.07836: exiting _queue_task() for managed_node2/set_fact 26764 1726882734.07849: done queuing things up, now waiting for results queue to drain 26764 1726882734.07850: waiting for pending results... 
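[annotation] The preceding "Stat profile file" task ran the stat module against /etc/sysconfig/network-scripts/ifcfg-rpltstbr (the module_args are visible in the _execute_module call above) and returned exists: false; the task queued here at get_profile_stat.yml:17 guards on profile_stat.stat.exists, which the entries that follow show evaluating to False, so it is skipped. A hedged sketch of that pair, assuming the stat output is registered as profile_stat (the register name is taken from the conditional in the log; the rest is a reconstruction, not the verbatim file):

# Hedged reconstruction based on the stat module_args and the
# "profile_stat.stat.exists" conditional visible in this log.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # resolves to ifcfg-rpltstbr in this run (assumed templating)
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true      # assumed body; the task name suggests it flips the flag
  when: profile_stat.stat.exists      # False here, so the task is skipped
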
26764 1726882734.08025: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 26764 1726882734.08104: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000427 26764 1726882734.08114: variable 'ansible_search_path' from source: unknown 26764 1726882734.08118: variable 'ansible_search_path' from source: unknown 26764 1726882734.08147: calling self._execute() 26764 1726882734.08220: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.08225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.08234: variable 'omit' from source: magic vars 26764 1726882734.08510: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.08519: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.08604: variable 'profile_stat' from source: set_fact 26764 1726882734.08617: Evaluated conditional (profile_stat.stat.exists): False 26764 1726882734.08620: when evaluation is False, skipping this task 26764 1726882734.08624: _execute() done 26764 1726882734.08626: dumping result to json 26764 1726882734.08629: done dumping result, returning 26764 1726882734.08631: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-9875-c9a3-000000000427] 26764 1726882734.08638: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000427 26764 1726882734.08723: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000427 26764 1726882734.08726: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26764 1726882734.08789: no more pending results, returning what we have 26764 1726882734.08793: results queue empty 26764 1726882734.08794: checking for any_errors_fatal 26764 1726882734.08799: done checking for any_errors_fatal 26764 1726882734.08800: checking for max_fail_percentage 26764 1726882734.08801: done checking for max_fail_percentage 26764 1726882734.08802: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.08803: done checking to see if all hosts have failed 26764 1726882734.08803: getting the remaining hosts for this loop 26764 1726882734.08804: done getting the remaining hosts for this loop 26764 1726882734.08807: getting the next task for host managed_node2 26764 1726882734.08812: done getting next task for host managed_node2 26764 1726882734.08814: ^ task is: TASK: Get NM profile info 26764 1726882734.08818: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.08821: getting variables 26764 1726882734.08823: in VariableManager get_vars() 26764 1726882734.08860: Calling all_inventory to load vars for managed_node2 26764 1726882734.08862: Calling groups_inventory to load vars for managed_node2 26764 1726882734.08868: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.08876: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.08877: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.08879: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.09643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.10582: done with get_vars() 26764 1726882734.10597: done getting variables 26764 1726882734.10659: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:38:54 -0400 (0:00:00.030) 0:00:20.048 ****** 26764 1726882734.10685: entering _queue_task() for managed_node2/shell 26764 1726882734.10686: Creating lock for shell 26764 1726882734.10881: worker is 1 (out of 1 available) 26764 1726882734.10895: exiting _queue_task() for managed_node2/shell 26764 1726882734.10906: done queuing things up, now waiting for results queue to drain 26764 1726882734.10907: waiting for pending results... 
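[annotation] TASK [Get NM profile info] at get_profile_stat.yml:25 uses the shell action; the command that eventually runs appears verbatim in the module invocation further down (nmcli -f NAME,FILENAME connection show |grep rpltstbr | grep /etc). A minimal sketch of such a task, with the grep target parameterized on the profile name as the earlier variable lookups suggest; "nm_profile_exists" is an assumed register name used only for illustration.

# Hedged sketch: the command string matches the invocation shown in the log,
# the task name matches the TASK banner; register name and error handling are assumptions.
- name: Get NM profile info
  shell: "nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc"
  register: nm_profile_exists
  ignore_errors: true   # assumption: grep returns non-zero when the profile is absent
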
26764 1726882734.11068: running TaskExecutor() for managed_node2/TASK: Get NM profile info 26764 1726882734.11137: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000428 26764 1726882734.11147: variable 'ansible_search_path' from source: unknown 26764 1726882734.11150: variable 'ansible_search_path' from source: unknown 26764 1726882734.11181: calling self._execute() 26764 1726882734.11251: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.11257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.11270: variable 'omit' from source: magic vars 26764 1726882734.11513: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.11523: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.11529: variable 'omit' from source: magic vars 26764 1726882734.11561: variable 'omit' from source: magic vars 26764 1726882734.11626: variable 'profile' from source: play vars 26764 1726882734.11630: variable 'interface' from source: play vars 26764 1726882734.11683: variable 'interface' from source: play vars 26764 1726882734.11696: variable 'omit' from source: magic vars 26764 1726882734.11727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882734.11752: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882734.11770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882734.11785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.11794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.11815: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882734.11819: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.11822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.11893: Set connection var ansible_shell_executable to /bin/sh 26764 1726882734.11897: Set connection var ansible_shell_type to sh 26764 1726882734.11903: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882734.11908: Set connection var ansible_timeout to 10 26764 1726882734.11913: Set connection var ansible_connection to ssh 26764 1726882734.11919: Set connection var ansible_pipelining to False 26764 1726882734.11934: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.11937: variable 'ansible_connection' from source: unknown 26764 1726882734.11939: variable 'ansible_module_compression' from source: unknown 26764 1726882734.11941: variable 'ansible_shell_type' from source: unknown 26764 1726882734.11944: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.11946: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.11950: variable 'ansible_pipelining' from source: unknown 26764 1726882734.11952: variable 'ansible_timeout' from source: unknown 26764 1726882734.11956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.12056: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882734.12069: variable 'omit' from source: magic vars 26764 1726882734.12072: starting attempt loop 26764 1726882734.12074: running the handler 26764 1726882734.12083: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882734.12098: _low_level_execute_command(): starting 26764 1726882734.12110: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882734.12626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.12635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.12672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882734.12688: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882734.12699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.12737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882734.12749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.12867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.14474: stdout chunk (state=3): >>>/root <<< 26764 1726882734.14572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.14623: stderr chunk (state=3): >>><<< 26764 1726882734.14627: stdout chunk (state=3): >>><<< 26764 1726882734.14645: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882734.14656: _low_level_execute_command(): starting 26764 1726882734.14661: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419 `" && echo ansible-tmp-1726882734.1464436-27619-21802123937419="` echo /root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419 `" ) && sleep 0' 26764 1726882734.15100: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.15110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.15139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882734.15152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882734.15162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.15216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882734.15222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.15334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.17204: stdout chunk (state=3): >>>ansible-tmp-1726882734.1464436-27619-21802123937419=/root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419 <<< 26764 1726882734.17311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.17353: stderr chunk (state=3): >>><<< 26764 1726882734.17356: stdout chunk (state=3): >>><<< 26764 1726882734.17372: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882734.1464436-27619-21802123937419=/root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882734.17398: variable 'ansible_module_compression' from source: unknown 26764 1726882734.17434: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26764 1726882734.17462: variable 'ansible_facts' from source: unknown 26764 1726882734.17528: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419/AnsiballZ_command.py 26764 1726882734.17632: Sending initial data 26764 1726882734.17635: Sent initial data (155 bytes) 26764 1726882734.18271: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882734.18277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.18288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.18329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.18332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.18335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.18393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882734.18396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.18501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.20238: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882734.20331: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882734.20429: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmp0xwol1e2 
/root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419/AnsiballZ_command.py <<< 26764 1726882734.20525: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882734.21540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.21627: stderr chunk (state=3): >>><<< 26764 1726882734.21630: stdout chunk (state=3): >>><<< 26764 1726882734.21645: done transferring module to remote 26764 1726882734.21653: _low_level_execute_command(): starting 26764 1726882734.21658: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419/ /root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419/AnsiballZ_command.py && sleep 0' 26764 1726882734.22078: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.22084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.22111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.22123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.22176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882734.22188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.22294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.24024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.24070: stderr chunk (state=3): >>><<< 26764 1726882734.24076: stdout chunk (state=3): >>><<< 26764 1726882734.24089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882734.24092: _low_level_execute_command(): starting 26764 1726882734.24096: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419/AnsiballZ_command.py && sleep 0' 26764 1726882734.24500: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.24504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.24533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.24536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.24538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.24587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882734.24590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.24705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.39486: stdout chunk (state=3): >>> {"changed": true, "stdout": "rpltstbr /etc/NetworkManager/system-connections/rpltstbr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep rpltstbr | grep /etc", "start": "2024-09-20 21:38:54.374446", "end": "2024-09-20 21:38:54.392816", "delta": "0:00:00.018370", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep rpltstbr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26764 1726882734.40669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 26764 1726882734.40723: stderr chunk (state=3): >>><<< 26764 1726882734.40726: stdout chunk (state=3): >>><<< 26764 1726882734.40744: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "rpltstbr /etc/NetworkManager/system-connections/rpltstbr.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep rpltstbr | grep /etc", "start": "2024-09-20 21:38:54.374446", "end": "2024-09-20 21:38:54.392816", "delta": "0:00:00.018370", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep rpltstbr | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
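The raw module result above shows exactly what the "Get NM profile info" step executed on the remote host: a shell pipeline that lists NetworkManager connections with nmcli and keeps only the rpltstbr entries stored under /etc. The task definition itself is not reproduced in this log, so the sketch below is only a plausible reconstruction, not the verbatim contents of get_profile_stat.yml; the register name is inferred from the nm_profile_exists.rc == 0 conditional evaluated further down, and changed_when: false is assumed because the reported task result shows "changed": false even though the module output above says "changed": true.

    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists   # inferred from the later 'nm_profile_exists.rc == 0' check
      changed_when: false           # assumption; would explain why the task reports changed=false
      ignore_errors: true           # assumption; grep exits non-zero when the profile is absent

In this run the pipeline returned rc=0 with the single line "rpltstbr /etc/NetworkManager/system-connections/rpltstbr.nmconnection", so the profile lookup succeeded.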
26764 1726882734.40774: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep rpltstbr | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882734.40781: _low_level_execute_command(): starting 26764 1726882734.40786: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882734.1464436-27619-21802123937419/ > /dev/null 2>&1 && sleep 0' 26764 1726882734.41239: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882734.41244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.41282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.41301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882734.41311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.41350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882734.41362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.41477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.43252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.43297: stderr chunk (state=3): >>><<< 26764 1726882734.43300: stdout chunk (state=3): >>><<< 26764 1726882734.43314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882734.43320: handler run complete 26764 1726882734.43342: Evaluated conditional (False): False 26764 1726882734.43350: attempt loop complete, returning result 26764 1726882734.43352: _execute() done 26764 1726882734.43355: dumping result to json 26764 1726882734.43360: done dumping result, returning 26764 1726882734.43369: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0e448fcc-3ce9-9875-c9a3-000000000428] 26764 1726882734.43374: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000428 26764 1726882734.43468: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000428 26764 1726882734.43471: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep rpltstbr | grep /etc", "delta": "0:00:00.018370", "end": "2024-09-20 21:38:54.392816", "rc": 0, "start": "2024-09-20 21:38:54.374446" } STDOUT: rpltstbr /etc/NetworkManager/system-connections/rpltstbr.nmconnection 26764 1726882734.43535: no more pending results, returning what we have 26764 1726882734.43539: results queue empty 26764 1726882734.43540: checking for any_errors_fatal 26764 1726882734.43547: done checking for any_errors_fatal 26764 1726882734.43547: checking for max_fail_percentage 26764 1726882734.43549: done checking for max_fail_percentage 26764 1726882734.43550: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.43551: done checking to see if all hosts have failed 26764 1726882734.43552: getting the remaining hosts for this loop 26764 1726882734.43553: done getting the remaining hosts for this loop 26764 1726882734.43556: getting the next task for host managed_node2 26764 1726882734.43567: done getting next task for host managed_node2 26764 1726882734.43569: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 26764 1726882734.43573: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.43577: getting variables 26764 1726882734.43579: in VariableManager get_vars() 26764 1726882734.43617: Calling all_inventory to load vars for managed_node2 26764 1726882734.43620: Calling groups_inventory to load vars for managed_node2 26764 1726882734.43621: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.43631: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.43634: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.43636: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.44578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.45503: done with get_vars() 26764 1726882734.45519: done getting variables 26764 1726882734.45561: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:38:54 -0400 (0:00:00.348) 0:00:20.397 ****** 26764 1726882734.45587: entering _queue_task() for managed_node2/set_fact 26764 1726882734.45780: worker is 1 (out of 1 available) 26764 1726882734.45792: exiting _queue_task() for managed_node2/set_fact 26764 1726882734.45804: done queuing things up, now waiting for results queue to drain 26764 1726882734.45805: waiting for pending results... 
26764 1726882734.45977: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 26764 1726882734.46054: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000429 26764 1726882734.46064: variable 'ansible_search_path' from source: unknown 26764 1726882734.46068: variable 'ansible_search_path' from source: unknown 26764 1726882734.46098: calling self._execute() 26764 1726882734.46172: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.46179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.46188: variable 'omit' from source: magic vars 26764 1726882734.46451: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.46462: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.46553: variable 'nm_profile_exists' from source: set_fact 26764 1726882734.46565: Evaluated conditional (nm_profile_exists.rc == 0): True 26764 1726882734.46575: variable 'omit' from source: magic vars 26764 1726882734.46605: variable 'omit' from source: magic vars 26764 1726882734.46626: variable 'omit' from source: magic vars 26764 1726882734.46657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882734.46689: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882734.46706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882734.46719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.46729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.46751: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882734.46754: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.46758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.46830: Set connection var ansible_shell_executable to /bin/sh 26764 1726882734.46833: Set connection var ansible_shell_type to sh 26764 1726882734.46841: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882734.46846: Set connection var ansible_timeout to 10 26764 1726882734.46851: Set connection var ansible_connection to ssh 26764 1726882734.46856: Set connection var ansible_pipelining to False 26764 1726882734.46876: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.46879: variable 'ansible_connection' from source: unknown 26764 1726882734.46881: variable 'ansible_module_compression' from source: unknown 26764 1726882734.46883: variable 'ansible_shell_type' from source: unknown 26764 1726882734.46886: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.46888: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.46892: variable 'ansible_pipelining' from source: unknown 26764 1726882734.46895: variable 'ansible_timeout' from source: unknown 26764 1726882734.46897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.46999: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882734.47012: variable 'omit' from source: magic vars 26764 1726882734.47015: starting attempt loop 26764 1726882734.47019: running the handler 26764 1726882734.47030: handler run complete 26764 1726882734.47039: attempt loop complete, returning result 26764 1726882734.47041: _execute() done 26764 1726882734.47044: dumping result to json 26764 1726882734.47046: done dumping result, returning 26764 1726882734.47053: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-9875-c9a3-000000000429] 26764 1726882734.47058: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000429 26764 1726882734.47142: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000429 26764 1726882734.47145: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 26764 1726882734.47194: no more pending results, returning what we have 26764 1726882734.47197: results queue empty 26764 1726882734.47198: checking for any_errors_fatal 26764 1726882734.47204: done checking for any_errors_fatal 26764 1726882734.47204: checking for max_fail_percentage 26764 1726882734.47206: done checking for max_fail_percentage 26764 1726882734.47207: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.47207: done checking to see if all hosts have failed 26764 1726882734.47208: getting the remaining hosts for this loop 26764 1726882734.47210: done getting the remaining hosts for this loop 26764 1726882734.47213: getting the next task for host managed_node2 26764 1726882734.47220: done getting next task for host managed_node2 26764 1726882734.47222: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 26764 1726882734.47229: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.47233: getting variables 26764 1726882734.47234: in VariableManager get_vars() 26764 1726882734.47269: Calling all_inventory to load vars for managed_node2 26764 1726882734.47271: Calling groups_inventory to load vars for managed_node2 26764 1726882734.47273: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.47282: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.47284: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.47287: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.48120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.49048: done with get_vars() 26764 1726882734.49062: done getting variables 26764 1726882734.49103: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 26764 1726882734.49186: variable 'profile' from source: play vars 26764 1726882734.49190: variable 'interface' from source: play vars 26764 1726882734.49231: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-rpltstbr] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:38:54 -0400 (0:00:00.036) 0:00:20.434 ****** 26764 1726882734.49256: entering _queue_task() for managed_node2/command 26764 1726882734.49440: worker is 1 (out of 1 available) 26764 1726882734.49453: exiting _queue_task() for managed_node2/command 26764 1726882734.49466: done queuing things up, now waiting for results queue to drain 26764 1726882734.49467: waiting for pending results... 
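The set_fact result just above (lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint all set to true), together with the two conditionals the worker evaluated (ansible_distribution_major_version != '6' and nm_profile_exists.rc == 0), pins down what the task at get_profile_stat.yml:35 is doing. A minimal sketch consistent with that behaviour, not the verbatim task, would be:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0   # the nmcli lookup above returned rc=0, so this fires

set_fact runs entirely on the controller, which is why no SSH traffic appears for this task in the log.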
26764 1726882734.49636: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-rpltstbr 26764 1726882734.49708: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000042b 26764 1726882734.49719: variable 'ansible_search_path' from source: unknown 26764 1726882734.49724: variable 'ansible_search_path' from source: unknown 26764 1726882734.49752: calling self._execute() 26764 1726882734.49824: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.49830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.49841: variable 'omit' from source: magic vars 26764 1726882734.50090: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.50100: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.50185: variable 'profile_stat' from source: set_fact 26764 1726882734.50196: Evaluated conditional (profile_stat.stat.exists): False 26764 1726882734.50199: when evaluation is False, skipping this task 26764 1726882734.50201: _execute() done 26764 1726882734.50204: dumping result to json 26764 1726882734.50207: done dumping result, returning 26764 1726882734.50212: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-rpltstbr [0e448fcc-3ce9-9875-c9a3-00000000042b] 26764 1726882734.50217: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000042b 26764 1726882734.50298: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000042b 26764 1726882734.50301: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26764 1726882734.50377: no more pending results, returning what we have 26764 1726882734.50380: results queue empty 26764 1726882734.50381: checking for any_errors_fatal 26764 1726882734.50385: done checking for any_errors_fatal 26764 1726882734.50386: checking for max_fail_percentage 26764 1726882734.50387: done checking for max_fail_percentage 26764 1726882734.50388: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.50389: done checking to see if all hosts have failed 26764 1726882734.50390: getting the remaining hosts for this loop 26764 1726882734.50391: done getting the remaining hosts for this loop 26764 1726882734.50393: getting the next task for host managed_node2 26764 1726882734.50399: done getting next task for host managed_node2 26764 1726882734.50401: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 26764 1726882734.50404: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.50407: getting variables 26764 1726882734.50408: in VariableManager get_vars() 26764 1726882734.50436: Calling all_inventory to load vars for managed_node2 26764 1726882734.50438: Calling groups_inventory to load vars for managed_node2 26764 1726882734.50439: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.50446: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.50448: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.50449: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.51200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.52125: done with get_vars() 26764 1726882734.52138: done getting variables 26764 1726882734.52180: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 26764 1726882734.52249: variable 'profile' from source: play vars 26764 1726882734.52252: variable 'interface' from source: play vars 26764 1726882734.52294: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-rpltstbr] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:38:54 -0400 (0:00:00.030) 0:00:20.464 ****** 26764 1726882734.52315: entering _queue_task() for managed_node2/set_fact 26764 1726882734.52488: worker is 1 (out of 1 available) 26764 1726882734.52503: exiting _queue_task() for managed_node2/set_fact 26764 1726882734.52517: done queuing things up, now waiting for results queue to drain 26764 1726882734.52518: waiting for pending results... 
26764 1726882734.52677: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-rpltstbr 26764 1726882734.52745: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000042c 26764 1726882734.52754: variable 'ansible_search_path' from source: unknown 26764 1726882734.52758: variable 'ansible_search_path' from source: unknown 26764 1726882734.52789: calling self._execute() 26764 1726882734.52858: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.52865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.52876: variable 'omit' from source: magic vars 26764 1726882734.53117: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.53126: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.53211: variable 'profile_stat' from source: set_fact 26764 1726882734.53220: Evaluated conditional (profile_stat.stat.exists): False 26764 1726882734.53223: when evaluation is False, skipping this task 26764 1726882734.53225: _execute() done 26764 1726882734.53228: dumping result to json 26764 1726882734.53231: done dumping result, returning 26764 1726882734.53238: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-rpltstbr [0e448fcc-3ce9-9875-c9a3-00000000042c] 26764 1726882734.53242: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000042c 26764 1726882734.53326: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000042c 26764 1726882734.53328: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26764 1726882734.53379: no more pending results, returning what we have 26764 1726882734.53382: results queue empty 26764 1726882734.53382: checking for any_errors_fatal 26764 1726882734.53386: done checking for any_errors_fatal 26764 1726882734.53387: checking for max_fail_percentage 26764 1726882734.53388: done checking for max_fail_percentage 26764 1726882734.53389: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.53390: done checking to see if all hosts have failed 26764 1726882734.53391: getting the remaining hosts for this loop 26764 1726882734.53392: done getting the remaining hosts for this loop 26764 1726882734.53394: getting the next task for host managed_node2 26764 1726882734.53400: done getting next task for host managed_node2 26764 1726882734.53402: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 26764 1726882734.53406: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.53409: getting variables 26764 1726882734.53410: in VariableManager get_vars() 26764 1726882734.53439: Calling all_inventory to load vars for managed_node2 26764 1726882734.53441: Calling groups_inventory to load vars for managed_node2 26764 1726882734.53442: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.53449: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.53451: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.53453: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.54459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.56089: done with get_vars() 26764 1726882734.56109: done getting variables 26764 1726882734.56165: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 26764 1726882734.56265: variable 'profile' from source: play vars 26764 1726882734.56268: variable 'interface' from source: play vars 26764 1726882734.56325: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-rpltstbr] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:38:54 -0400 (0:00:00.040) 0:00:20.505 ****** 26764 1726882734.56352: entering _queue_task() for managed_node2/command 26764 1726882734.56588: worker is 1 (out of 1 available) 26764 1726882734.56601: exiting _queue_task() for managed_node2/command 26764 1726882734.56614: done queuing things up, now waiting for results queue to drain 26764 1726882734.56615: waiting for pending results... 
26764 1726882734.56888: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-rpltstbr 26764 1726882734.57008: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000042d 26764 1726882734.57031: variable 'ansible_search_path' from source: unknown 26764 1726882734.57040: variable 'ansible_search_path' from source: unknown 26764 1726882734.57086: calling self._execute() 26764 1726882734.57177: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.57182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.57190: variable 'omit' from source: magic vars 26764 1726882734.57431: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.57441: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.57524: variable 'profile_stat' from source: set_fact 26764 1726882734.57534: Evaluated conditional (profile_stat.stat.exists): False 26764 1726882734.57537: when evaluation is False, skipping this task 26764 1726882734.57539: _execute() done 26764 1726882734.57541: dumping result to json 26764 1726882734.57546: done dumping result, returning 26764 1726882734.57552: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-rpltstbr [0e448fcc-3ce9-9875-c9a3-00000000042d] 26764 1726882734.57557: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000042d 26764 1726882734.57635: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000042d 26764 1726882734.57638: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26764 1726882734.57683: no more pending results, returning what we have 26764 1726882734.57687: results queue empty 26764 1726882734.57688: checking for any_errors_fatal 26764 1726882734.57692: done checking for any_errors_fatal 26764 1726882734.57693: checking for max_fail_percentage 26764 1726882734.57694: done checking for max_fail_percentage 26764 1726882734.57695: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.57696: done checking to see if all hosts have failed 26764 1726882734.57697: getting the remaining hosts for this loop 26764 1726882734.57698: done getting the remaining hosts for this loop 26764 1726882734.57701: getting the next task for host managed_node2 26764 1726882734.57706: done getting next task for host managed_node2 26764 1726882734.57708: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 26764 1726882734.57712: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.57715: getting variables 26764 1726882734.57716: in VariableManager get_vars() 26764 1726882734.57744: Calling all_inventory to load vars for managed_node2 26764 1726882734.57746: Calling groups_inventory to load vars for managed_node2 26764 1726882734.57748: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.57757: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.57759: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.57762: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.58523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.59547: done with get_vars() 26764 1726882734.59561: done getting variables 26764 1726882734.59604: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 26764 1726882734.59676: variable 'profile' from source: play vars 26764 1726882734.59679: variable 'interface' from source: play vars 26764 1726882734.59716: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-rpltstbr] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:38:54 -0400 (0:00:00.033) 0:00:20.539 ****** 26764 1726882734.59737: entering _queue_task() for managed_node2/set_fact 26764 1726882734.59908: worker is 1 (out of 1 available) 26764 1726882734.59919: exiting _queue_task() for managed_node2/set_fact 26764 1726882734.59930: done queuing things up, now waiting for results queue to drain 26764 1726882734.59931: waiting for pending results... 
26764 1726882734.60091: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-rpltstbr 26764 1726882734.60148: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000042e 26764 1726882734.60168: variable 'ansible_search_path' from source: unknown 26764 1726882734.60171: variable 'ansible_search_path' from source: unknown 26764 1726882734.60199: calling self._execute() 26764 1726882734.60274: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.60281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.60291: variable 'omit' from source: magic vars 26764 1726882734.60536: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.60546: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.60633: variable 'profile_stat' from source: set_fact 26764 1726882734.60641: Evaluated conditional (profile_stat.stat.exists): False 26764 1726882734.60644: when evaluation is False, skipping this task 26764 1726882734.60647: _execute() done 26764 1726882734.60650: dumping result to json 26764 1726882734.60654: done dumping result, returning 26764 1726882734.60660: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-rpltstbr [0e448fcc-3ce9-9875-c9a3-00000000042e] 26764 1726882734.60666: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000042e 26764 1726882734.60745: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000042e 26764 1726882734.60748: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 26764 1726882734.60797: no more pending results, returning what we have 26764 1726882734.60800: results queue empty 26764 1726882734.60801: checking for any_errors_fatal 26764 1726882734.60805: done checking for any_errors_fatal 26764 1726882734.60806: checking for max_fail_percentage 26764 1726882734.60808: done checking for max_fail_percentage 26764 1726882734.60808: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.60809: done checking to see if all hosts have failed 26764 1726882734.60810: getting the remaining hosts for this loop 26764 1726882734.60811: done getting the remaining hosts for this loop 26764 1726882734.60814: getting the next task for host managed_node2 26764 1726882734.60821: done getting next task for host managed_node2 26764 1726882734.60823: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 26764 1726882734.60826: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.60829: getting variables 26764 1726882734.60830: in VariableManager get_vars() 26764 1726882734.60868: Calling all_inventory to load vars for managed_node2 26764 1726882734.60871: Calling groups_inventory to load vars for managed_node2 26764 1726882734.60872: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.60879: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.60881: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.60883: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.61625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.62556: done with get_vars() 26764 1726882734.62574: done getting variables 26764 1726882734.62613: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 26764 1726882734.62686: variable 'profile' from source: play vars 26764 1726882734.62688: variable 'interface' from source: play vars 26764 1726882734.62726: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'rpltstbr'] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:38:54 -0400 (0:00:00.030) 0:00:20.569 ****** 26764 1726882734.62746: entering _queue_task() for managed_node2/assert 26764 1726882734.62909: worker is 1 (out of 1 available) 26764 1726882734.62921: exiting _queue_task() for managed_node2/assert 26764 1726882734.62932: done queuing things up, now waiting for results queue to drain 26764 1726882734.62933: waiting for pending results... 
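The four ifcfg-oriented tasks above (the Get/Verify pair for the ansible_managed comment and the matching pair for the fingerprint comment) are all skipped with false_condition: profile_stat.stat.exists, i.e. they only apply when a legacy ifcfg file for the profile exists, and on this host the connection lives under /etc/NetworkManager/system-connections instead. Their bodies never run and are not visible in the log; the guard pattern they share looks roughly like this (the command body is illustrative only, an assumption):

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep '^# ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # illustrative, not taken from the log
      when: profile_stat.stat.exists   # false in this run, so the task is skipped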
26764 1726882734.63095: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'rpltstbr' 26764 1726882734.63160: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000397 26764 1726882734.63175: variable 'ansible_search_path' from source: unknown 26764 1726882734.63178: variable 'ansible_search_path' from source: unknown 26764 1726882734.63204: calling self._execute() 26764 1726882734.63273: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.63277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.63287: variable 'omit' from source: magic vars 26764 1726882734.63522: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.63532: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.63538: variable 'omit' from source: magic vars 26764 1726882734.63562: variable 'omit' from source: magic vars 26764 1726882734.63630: variable 'profile' from source: play vars 26764 1726882734.63633: variable 'interface' from source: play vars 26764 1726882734.63679: variable 'interface' from source: play vars 26764 1726882734.63694: variable 'omit' from source: magic vars 26764 1726882734.63727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882734.63751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882734.63770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882734.63783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.63794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.63818: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882734.63821: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.63823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.63893: Set connection var ansible_shell_executable to /bin/sh 26764 1726882734.63897: Set connection var ansible_shell_type to sh 26764 1726882734.63908: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882734.63911: Set connection var ansible_timeout to 10 26764 1726882734.63914: Set connection var ansible_connection to ssh 26764 1726882734.63919: Set connection var ansible_pipelining to False 26764 1726882734.63938: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.63941: variable 'ansible_connection' from source: unknown 26764 1726882734.63943: variable 'ansible_module_compression' from source: unknown 26764 1726882734.63946: variable 'ansible_shell_type' from source: unknown 26764 1726882734.63948: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.63950: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.63954: variable 'ansible_pipelining' from source: unknown 26764 1726882734.63956: variable 'ansible_timeout' from source: unknown 26764 1726882734.63960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.64061: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882734.64073: variable 'omit' from source: magic vars 26764 1726882734.64079: starting attempt loop 26764 1726882734.64082: running the handler 26764 1726882734.64155: variable 'lsr_net_profile_exists' from source: set_fact 26764 1726882734.64159: Evaluated conditional (lsr_net_profile_exists): True 26764 1726882734.64166: handler run complete 26764 1726882734.64180: attempt loop complete, returning result 26764 1726882734.64183: _execute() done 26764 1726882734.64185: dumping result to json 26764 1726882734.64188: done dumping result, returning 26764 1726882734.64194: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'rpltstbr' [0e448fcc-3ce9-9875-c9a3-000000000397] 26764 1726882734.64199: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000397 26764 1726882734.64280: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000397 26764 1726882734.64283: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 26764 1726882734.64324: no more pending results, returning what we have 26764 1726882734.64327: results queue empty 26764 1726882734.64328: checking for any_errors_fatal 26764 1726882734.64332: done checking for any_errors_fatal 26764 1726882734.64332: checking for max_fail_percentage 26764 1726882734.64334: done checking for max_fail_percentage 26764 1726882734.64335: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.64335: done checking to see if all hosts have failed 26764 1726882734.64336: getting the remaining hosts for this loop 26764 1726882734.64341: done getting the remaining hosts for this loop 26764 1726882734.64344: getting the next task for host managed_node2 26764 1726882734.64353: done getting next task for host managed_node2 26764 1726882734.64355: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 26764 1726882734.64358: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.64361: getting variables 26764 1726882734.64362: in VariableManager get_vars() 26764 1726882734.64393: Calling all_inventory to load vars for managed_node2 26764 1726882734.64396: Calling groups_inventory to load vars for managed_node2 26764 1726882734.64398: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.64405: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.64407: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.64408: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.65245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.66152: done with get_vars() 26764 1726882734.66168: done getting variables 26764 1726882734.66207: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 26764 1726882734.66278: variable 'profile' from source: play vars 26764 1726882734.66281: variable 'interface' from source: play vars 26764 1726882734.66319: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'rpltstbr'] ******** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:38:54 -0400 (0:00:00.035) 0:00:20.605 ****** 26764 1726882734.66342: entering _queue_task() for managed_node2/assert 26764 1726882734.66508: worker is 1 (out of 1 available) 26764 1726882734.66519: exiting _queue_task() for managed_node2/assert 26764 1726882734.66532: done queuing things up, now waiting for results queue to drain 26764 1726882734.66532: waiting for pending results... 
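The assertion tasks in assert_profile_present.yml simply test the flags set earlier from the nmcli output: the first assert above evaluated lsr_net_profile_exists and reported "All assertions passed", and the ones that follow check lsr_net_profile_ansible_managed and the fingerprint flag the same way. A hedged sketch of that pattern (not the verbatim file) is:

    - name: Assert that the profile is present - '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      ansible.builtin.assert:
        that:
          - lsr_net_profile_ansible_managed

Like set_fact, assert is a controller-side action, so these tasks complete in a few milliseconds with no SSH round trip.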
26764 1726882734.66689: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'rpltstbr' 26764 1726882734.66751: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000398 26764 1726882734.66761: variable 'ansible_search_path' from source: unknown 26764 1726882734.66772: variable 'ansible_search_path' from source: unknown 26764 1726882734.66794: calling self._execute() 26764 1726882734.66858: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.66863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.66876: variable 'omit' from source: magic vars 26764 1726882734.67107: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.67116: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.67122: variable 'omit' from source: magic vars 26764 1726882734.67147: variable 'omit' from source: magic vars 26764 1726882734.67215: variable 'profile' from source: play vars 26764 1726882734.67218: variable 'interface' from source: play vars 26764 1726882734.67261: variable 'interface' from source: play vars 26764 1726882734.67282: variable 'omit' from source: magic vars 26764 1726882734.67308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882734.67333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882734.67347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882734.67359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.67372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.67395: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882734.67398: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.67400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.67465: Set connection var ansible_shell_executable to /bin/sh 26764 1726882734.67468: Set connection var ansible_shell_type to sh 26764 1726882734.67478: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882734.67483: Set connection var ansible_timeout to 10 26764 1726882734.67488: Set connection var ansible_connection to ssh 26764 1726882734.67499: Set connection var ansible_pipelining to False 26764 1726882734.67511: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.67513: variable 'ansible_connection' from source: unknown 26764 1726882734.67517: variable 'ansible_module_compression' from source: unknown 26764 1726882734.67520: variable 'ansible_shell_type' from source: unknown 26764 1726882734.67522: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.67524: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.67526: variable 'ansible_pipelining' from source: unknown 26764 1726882734.67529: variable 'ansible_timeout' from source: unknown 26764 1726882734.67533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.67634: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882734.67643: variable 'omit' from source: magic vars 26764 1726882734.67649: starting attempt loop 26764 1726882734.67652: running the handler 26764 1726882734.67729: variable 'lsr_net_profile_ansible_managed' from source: set_fact 26764 1726882734.67733: Evaluated conditional (lsr_net_profile_ansible_managed): True 26764 1726882734.67739: handler run complete 26764 1726882734.67750: attempt loop complete, returning result 26764 1726882734.67753: _execute() done 26764 1726882734.67756: dumping result to json 26764 1726882734.67759: done dumping result, returning 26764 1726882734.67768: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'rpltstbr' [0e448fcc-3ce9-9875-c9a3-000000000398] 26764 1726882734.67771: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000398 26764 1726882734.67849: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000398 26764 1726882734.67854: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 26764 1726882734.67899: no more pending results, returning what we have 26764 1726882734.67901: results queue empty 26764 1726882734.67902: checking for any_errors_fatal 26764 1726882734.67906: done checking for any_errors_fatal 26764 1726882734.67907: checking for max_fail_percentage 26764 1726882734.67908: done checking for max_fail_percentage 26764 1726882734.67909: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.67909: done checking to see if all hosts have failed 26764 1726882734.67910: getting the remaining hosts for this loop 26764 1726882734.67911: done getting the remaining hosts for this loop 26764 1726882734.67914: getting the next task for host managed_node2 26764 1726882734.67919: done getting next task for host managed_node2 26764 1726882734.67921: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 26764 1726882734.67923: ^ state is: HOST STATE: block=5, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.67933: getting variables 26764 1726882734.67935: in VariableManager get_vars() 26764 1726882734.67967: Calling all_inventory to load vars for managed_node2 26764 1726882734.67969: Calling groups_inventory to load vars for managed_node2 26764 1726882734.67971: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.67978: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.67980: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.67981: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.68730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.69736: done with get_vars() 26764 1726882734.69749: done getting variables 26764 1726882734.69791: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 26764 1726882734.69856: variable 'profile' from source: play vars 26764 1726882734.69858: variable 'interface' from source: play vars 26764 1726882734.69901: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in rpltstbr] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:38:54 -0400 (0:00:00.035) 0:00:20.640 ****** 26764 1726882734.69923: entering _queue_task() for managed_node2/assert 26764 1726882734.70083: worker is 1 (out of 1 available) 26764 1726882734.70095: exiting _queue_task() for managed_node2/assert 26764 1726882734.70107: done queuing things up, now waiting for results queue to drain 26764 1726882734.70108: waiting for pending results... 
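Editor's note on the two assertions traced above and below: each one simply evaluates a boolean that was stored earlier with set_fact (lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint) and reports "All assertions passed". A minimal sketch of what the corresponding tasks in assert_profile_present.yml could look like follows; the task names, the profile variable, and the two fact names are taken from the log, while the exact `that` expressions and failure messages are assumptions.

    # Sketch only: reconstructed from the task names and conditionals in the log,
    # not copied from assert_profile_present.yml.
    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_ansible_managed
        fail_msg: "The ansible managed comment is not present in {{ profile }}"   # assumed message

    - name: "Assert that the fingerprint comment is present in {{ profile }}"
      assert:
        that:
          - lsr_net_profile_fingerprint
        fail_msg: "The fingerprint comment is not present in {{ profile }}"       # assumed message
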
26764 1726882734.70254: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in rpltstbr 26764 1726882734.70316: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000399 26764 1726882734.70323: variable 'ansible_search_path' from source: unknown 26764 1726882734.70331: variable 'ansible_search_path' from source: unknown 26764 1726882734.70358: calling self._execute() 26764 1726882734.70428: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.70439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.70448: variable 'omit' from source: magic vars 26764 1726882734.70683: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.70692: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.70698: variable 'omit' from source: magic vars 26764 1726882734.70722: variable 'omit' from source: magic vars 26764 1726882734.70789: variable 'profile' from source: play vars 26764 1726882734.70793: variable 'interface' from source: play vars 26764 1726882734.70836: variable 'interface' from source: play vars 26764 1726882734.70849: variable 'omit' from source: magic vars 26764 1726882734.70883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882734.70906: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882734.70920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882734.70932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.70942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.70966: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882734.70969: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.70973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.71037: Set connection var ansible_shell_executable to /bin/sh 26764 1726882734.71040: Set connection var ansible_shell_type to sh 26764 1726882734.71048: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882734.71052: Set connection var ansible_timeout to 10 26764 1726882734.71057: Set connection var ansible_connection to ssh 26764 1726882734.71062: Set connection var ansible_pipelining to False 26764 1726882734.71083: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.71086: variable 'ansible_connection' from source: unknown 26764 1726882734.71089: variable 'ansible_module_compression' from source: unknown 26764 1726882734.71091: variable 'ansible_shell_type' from source: unknown 26764 1726882734.71093: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.71096: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.71098: variable 'ansible_pipelining' from source: unknown 26764 1726882734.71100: variable 'ansible_timeout' from source: unknown 26764 1726882734.71103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.71193: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882734.71202: variable 'omit' from source: magic vars 26764 1726882734.71208: starting attempt loop 26764 1726882734.71210: running the handler 26764 1726882734.71282: variable 'lsr_net_profile_fingerprint' from source: set_fact 26764 1726882734.71285: Evaluated conditional (lsr_net_profile_fingerprint): True 26764 1726882734.71291: handler run complete 26764 1726882734.71302: attempt loop complete, returning result 26764 1726882734.71305: _execute() done 26764 1726882734.71307: dumping result to json 26764 1726882734.71310: done dumping result, returning 26764 1726882734.71315: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in rpltstbr [0e448fcc-3ce9-9875-c9a3-000000000399] 26764 1726882734.71321: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000399 26764 1726882734.71398: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000399 26764 1726882734.71401: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 26764 1726882734.71474: no more pending results, returning what we have 26764 1726882734.71477: results queue empty 26764 1726882734.71477: checking for any_errors_fatal 26764 1726882734.71481: done checking for any_errors_fatal 26764 1726882734.71482: checking for max_fail_percentage 26764 1726882734.71484: done checking for max_fail_percentage 26764 1726882734.71485: checking to see if all hosts have failed and the running result is not ok 26764 1726882734.71485: done checking to see if all hosts have failed 26764 1726882734.71486: getting the remaining hosts for this loop 26764 1726882734.71487: done getting the remaining hosts for this loop 26764 1726882734.71490: getting the next task for host managed_node2 26764 1726882734.71495: done getting next task for host managed_node2 26764 1726882734.71496: ^ task is: TASK: Get network_connections output 26764 1726882734.71497: ^ state is: HOST STATE: block=5, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882734.71500: getting variables 26764 1726882734.71501: in VariableManager get_vars() 26764 1726882734.71522: Calling all_inventory to load vars for managed_node2 26764 1726882734.71524: Calling groups_inventory to load vars for managed_node2 26764 1726882734.71526: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882734.71532: Calling all_plugins_play to load vars for managed_node2 26764 1726882734.71534: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882734.71535: Calling groups_plugins_play to load vars for managed_node2 26764 1726882734.72280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882734.73198: done with get_vars() 26764 1726882734.73212: done getting variables TASK [Get network_connections output] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:43 Friday 20 September 2024 21:38:54 -0400 (0:00:00.033) 0:00:20.674 ****** 26764 1726882734.73259: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 26764 1726882734.73419: worker is 1 (out of 1 available) 26764 1726882734.73431: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 26764 1726882734.73443: done queuing things up, now waiting for results queue to drain 26764 1726882734.73444: waiting for pending results... 26764 1726882734.73590: running TaskExecutor() for managed_node2/TASK: Get network_connections output 26764 1726882734.73642: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000002d 26764 1726882734.73652: variable 'ansible_search_path' from source: unknown 26764 1726882734.73682: calling self._execute() 26764 1726882734.73742: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.73746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.73754: variable 'omit' from source: magic vars 26764 1726882734.73987: variable 'ansible_distribution_major_version' from source: facts 26764 1726882734.73997: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882734.74005: variable 'omit' from source: magic vars 26764 1726882734.74025: variable 'omit' from source: magic vars 26764 1726882734.74049: variable 'interface' from source: play vars 26764 1726882734.74096: variable 'interface' from source: play vars 26764 1726882734.74111: variable 'omit' from source: magic vars 26764 1726882734.74143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882734.74170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882734.74188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882734.74201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.74210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882734.74234: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882734.74237: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.74239: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 26764 1726882734.74313: Set connection var ansible_shell_executable to /bin/sh 26764 1726882734.74316: Set connection var ansible_shell_type to sh 26764 1726882734.74324: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882734.74329: Set connection var ansible_timeout to 10 26764 1726882734.74336: Set connection var ansible_connection to ssh 26764 1726882734.74341: Set connection var ansible_pipelining to False 26764 1726882734.74359: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.74362: variable 'ansible_connection' from source: unknown 26764 1726882734.74369: variable 'ansible_module_compression' from source: unknown 26764 1726882734.74371: variable 'ansible_shell_type' from source: unknown 26764 1726882734.74374: variable 'ansible_shell_executable' from source: unknown 26764 1726882734.74376: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882734.74381: variable 'ansible_pipelining' from source: unknown 26764 1726882734.74384: variable 'ansible_timeout' from source: unknown 26764 1726882734.74388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882734.74514: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882734.74522: variable 'omit' from source: magic vars 26764 1726882734.74528: starting attempt loop 26764 1726882734.74531: running the handler 26764 1726882734.74541: _low_level_execute_command(): starting 26764 1726882734.74549: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882734.75051: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.75071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.75085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.75100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882734.75114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.75152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882734.75169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.75288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.76947: stdout chunk (state=3): >>>/root <<< 26764 1726882734.77051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.77099: stderr chunk (state=3): >>><<< 26764 
1726882734.77102: stdout chunk (state=3): >>><<< 26764 1726882734.77121: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882734.77131: _low_level_execute_command(): starting 26764 1726882734.77137: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907 `" && echo ansible-tmp-1726882734.7712035-27642-138872408643907="` echo /root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907 `" ) && sleep 0' 26764 1726882734.77556: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.77574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.77587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882734.77615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.77651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882734.77667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.77774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.79626: stdout chunk (state=3): >>>ansible-tmp-1726882734.7712035-27642-138872408643907=/root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907 <<< 26764 1726882734.79740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.79788: stderr chunk (state=3): >>><<< 26764 1726882734.79791: 
stdout chunk (state=3): >>><<< 26764 1726882734.79803: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882734.7712035-27642-138872408643907=/root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882734.79834: variable 'ansible_module_compression' from source: unknown 26764 1726882734.79880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 26764 1726882734.79918: variable 'ansible_facts' from source: unknown 26764 1726882734.80008: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907/AnsiballZ_network_connections.py 26764 1726882734.80109: Sending initial data 26764 1726882734.80118: Sent initial data (168 bytes) 26764 1726882734.80748: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.80751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.80786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882734.80789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.80792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.80839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882734.80843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.80948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 
1726882734.82653: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882734.82747: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882734.82843: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpbjqkhl4e /root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907/AnsiballZ_network_connections.py <<< 26764 1726882734.82939: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882734.84276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.84358: stderr chunk (state=3): >>><<< 26764 1726882734.84361: stdout chunk (state=3): >>><<< 26764 1726882734.84378: done transferring module to remote 26764 1726882734.84386: _low_level_execute_command(): starting 26764 1726882734.84389: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907/ /root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907/AnsiballZ_network_connections.py && sleep 0' 26764 1726882734.84788: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882734.84801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.84818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.84830: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.84882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882734.84893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.84996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882734.86732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882734.86774: stderr chunk (state=3): >>><<< 26764 1726882734.86777: stdout chunk (state=3): >>><<< 26764 1726882734.86791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882734.86797: _low_level_execute_command(): starting 26764 1726882734.86802: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907/AnsiballZ_network_connections.py && sleep 0' 26764 1726882734.87196: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882734.87209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882734.87234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 26764 1726882734.87244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882734.87293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882734.87299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882734.87411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.11212: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'rpltstbr': update connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153\n[004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (is-modified)\n[005] #0, state:up persistent_state:present, 'rpltstbr': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "state": "up", "type": "bridge", "ip": {"address": ["192.0.2.72/31"], "dhcp4": false, "auto6": false}}], "__header": "# Ansible managed test header", "ignore_errors": false, 
"force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "state": "up", "type": "bridge", "ip": {"address": ["192.0.2.72/31"], "dhcp4": false, "auto6": false}}], "__header": "# Ansible managed test header", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 26764 1726882735.12783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882735.12860: stderr chunk (state=3): >>><<< 26764 1726882735.12872: stdout chunk (state=3): >>><<< 26764 1726882735.12896: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'rpltstbr': update connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153\n[004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (is-modified)\n[005] #0, state:up persistent_state:present, 'rpltstbr': connection reapplied\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "state": "up", "type": "bridge", "ip": {"address": ["192.0.2.72/31"], "dhcp4": false, "auto6": false}}], "__header": "# Ansible managed test header", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "state": "up", "type": "bridge", "ip": {"address": ["192.0.2.72/31"], "dhcp4": false, "auto6": false}}], "__header": "# Ansible managed test header", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
26764 1726882735.12949: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'rpltstbr', 'state': 'up', 'type': 'bridge', 'ip': {'address': ['192.0.2.72/31'], 'dhcp4': False, 'auto6': False}}], '__header': '# Ansible managed test header', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882735.12957: _low_level_execute_command(): starting 26764 1726882735.12962: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882734.7712035-27642-138872408643907/ > /dev/null 2>&1 && sleep 0' 26764 1726882735.13597: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.13607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.13638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.13643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.13646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.13702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.13708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882735.13710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.13806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.15775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.15834: stderr chunk (state=3): >>><<< 26764 1726882735.15839: stdout chunk (state=3): >>><<< 26764 1726882735.15856: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882735.15862: handler run complete 26764 1726882735.15905: attempt loop complete, returning result 26764 1726882735.15908: _execute() done 26764 1726882735.15910: dumping result to json 26764 1726882735.15916: done dumping result, returning 26764 1726882735.15924: done running TaskExecutor() for managed_node2/TASK: Get network_connections output [0e448fcc-3ce9-9875-c9a3-00000000002d] 26764 1726882735.15929: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002d 26764 1726882735.16047: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002d 26764 1726882735.16049: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "# Ansible managed test header", "connections": [ { "ip": { "address": [ "192.0.2.72/31" ], "auto6": false, "dhcp4": false }, "name": "rpltstbr", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'rpltstbr': update connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 [004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (is-modified) [005] #0, state:up persistent_state:present, 'rpltstbr': connection reapplied 26764 1726882735.16140: no more pending results, returning what we have 26764 1726882735.16143: results queue empty 26764 1726882735.16144: checking for any_errors_fatal 26764 1726882735.16150: done checking for any_errors_fatal 26764 1726882735.16151: checking for max_fail_percentage 26764 1726882735.16153: done checking for max_fail_percentage 26764 1726882735.16154: checking to see if all hosts have failed and the running result is not ok 26764 1726882735.16154: done checking to see if all hosts have failed 26764 1726882735.16155: getting the remaining hosts for this loop 26764 1726882735.16157: done getting the remaining hosts for this loop 26764 1726882735.16160: getting the next task for host managed_node2 26764 1726882735.16170: done getting next task for host managed_node2 26764 1726882735.16173: ^ task is: TASK: Show test_module_run 26764 1726882735.16175: ^ state is: HOST STATE: block=5, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882735.16178: getting variables 26764 1726882735.16179: in VariableManager get_vars() 26764 1726882735.16212: Calling all_inventory to load vars for managed_node2 26764 1726882735.16214: Calling groups_inventory to load vars for managed_node2 26764 1726882735.16216: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882735.16226: Calling all_plugins_play to load vars for managed_node2 26764 1726882735.16229: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882735.16232: Calling groups_plugins_play to load vars for managed_node2 26764 1726882735.17790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882735.19510: done with get_vars() 26764 1726882735.19532: done getting variables 26764 1726882735.19593: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show test_module_run] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:58 Friday 20 September 2024 21:38:55 -0400 (0:00:00.463) 0:00:21.137 ****** 26764 1726882735.19618: entering _queue_task() for managed_node2/debug 26764 1726882735.19884: worker is 1 (out of 1 available) 26764 1726882735.19897: exiting _queue_task() for managed_node2/debug 26764 1726882735.19909: done queuing things up, now waiting for results queue to drain 26764 1726882735.19910: waiting for pending results... 26764 1726882735.20205: running TaskExecutor() for managed_node2/TASK: Show test_module_run 26764 1726882735.20312: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000002e 26764 1726882735.20333: variable 'ansible_search_path' from source: unknown 26764 1726882735.20384: calling self._execute() 26764 1726882735.20488: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.20501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.20517: variable 'omit' from source: magic vars 26764 1726882735.20883: variable 'ansible_distribution_major_version' from source: facts 26764 1726882735.20906: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882735.20917: variable 'omit' from source: magic vars 26764 1726882735.20949: variable 'omit' from source: magic vars 26764 1726882735.20989: variable 'omit' from source: magic vars 26764 1726882735.21038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882735.21082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882735.21107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882735.21133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882735.21151: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882735.21190: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882735.21199: variable 'ansible_host' from source: host vars for 
'managed_node2' 26764 1726882735.21207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.21319: Set connection var ansible_shell_executable to /bin/sh 26764 1726882735.21332: Set connection var ansible_shell_type to sh 26764 1726882735.21349: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882735.21359: Set connection var ansible_timeout to 10 26764 1726882735.21375: Set connection var ansible_connection to ssh 26764 1726882735.21387: Set connection var ansible_pipelining to False 26764 1726882735.21413: variable 'ansible_shell_executable' from source: unknown 26764 1726882735.21421: variable 'ansible_connection' from source: unknown 26764 1726882735.21428: variable 'ansible_module_compression' from source: unknown 26764 1726882735.21440: variable 'ansible_shell_type' from source: unknown 26764 1726882735.21447: variable 'ansible_shell_executable' from source: unknown 26764 1726882735.21454: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.21461: variable 'ansible_pipelining' from source: unknown 26764 1726882735.21474: variable 'ansible_timeout' from source: unknown 26764 1726882735.21482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.21632: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882735.21650: variable 'omit' from source: magic vars 26764 1726882735.21673: starting attempt loop 26764 1726882735.21682: running the handler 26764 1726882735.21735: variable 'test_module_run' from source: set_fact 26764 1726882735.21822: variable 'test_module_run' from source: set_fact 26764 1726882735.21953: handler run complete 26764 1726882735.21997: attempt loop complete, returning result 26764 1726882735.22004: _execute() done 26764 1726882735.22011: dumping result to json 26764 1726882735.22020: done dumping result, returning 26764 1726882735.22032: done running TaskExecutor() for managed_node2/TASK: Show test_module_run [0e448fcc-3ce9-9875-c9a3-00000000002e] 26764 1726882735.22042: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002e ok: [managed_node2] => { "test_module_run": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "# Ansible managed test header", "connections": [ { "ip": { "address": [ "192.0.2.72/31" ], "auto6": false, "dhcp4": false }, "name": "rpltstbr", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'rpltstbr': update connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153\n[004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (is-modified)\n[005] #0, state:up persistent_state:present, 'rpltstbr': connection reapplied\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'rpltstbr': update connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153", "[004] #0, state:up persistent_state:present, 'rpltstbr': up connection rpltstbr, 19accc22-6d88-4131-bb4c-c8b05397a153 (is-modified)", "[005] #0, state:up persistent_state:present, 'rpltstbr': connection reapplied" ] } } 26764 1726882735.22223: no more pending 
results, returning what we have 26764 1726882735.22228: results queue empty 26764 1726882735.22229: checking for any_errors_fatal 26764 1726882735.22238: done checking for any_errors_fatal 26764 1726882735.22239: checking for max_fail_percentage 26764 1726882735.22240: done checking for max_fail_percentage 26764 1726882735.22242: checking to see if all hosts have failed and the running result is not ok 26764 1726882735.22242: done checking to see if all hosts have failed 26764 1726882735.22243: getting the remaining hosts for this loop 26764 1726882735.22245: done getting the remaining hosts for this loop 26764 1726882735.22249: getting the next task for host managed_node2 26764 1726882735.22256: done getting next task for host managed_node2 26764 1726882735.22259: ^ task is: TASK: Assert that reapply is found in log output 26764 1726882735.22261: ^ state is: HOST STATE: block=5, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882735.22267: getting variables 26764 1726882735.22270: in VariableManager get_vars() 26764 1726882735.22307: Calling all_inventory to load vars for managed_node2 26764 1726882735.22310: Calling groups_inventory to load vars for managed_node2 26764 1726882735.22313: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882735.22325: Calling all_plugins_play to load vars for managed_node2 26764 1726882735.22328: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882735.22331: Calling groups_plugins_play to load vars for managed_node2 26764 1726882735.23284: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002e 26764 1726882735.23287: WORKER PROCESS EXITING 26764 1726882735.23967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882735.25623: done with get_vars() 26764 1726882735.25643: done getting variables 26764 1726882735.25702: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that reapply is found in log output] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:61 Friday 20 September 2024 21:38:55 -0400 (0:00:00.061) 0:00:21.198 ****** 26764 1726882735.25728: entering _queue_task() for managed_node2/assert 26764 1726882735.25968: worker is 1 (out of 1 available) 26764 1726882735.25979: exiting _queue_task() for managed_node2/assert 26764 1726882735.25991: done queuing things up, now waiting for results queue to drain 26764 1726882735.25992: waiting for pending results... 
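Editor's note: the debug task just traced prints the stored test_module_run result, and the assert queued next checks that its stderr contains "connection reapplied", i.e. that NetworkManager reapplied the existing rpltstbr profile instead of tearing it down and recreating it (see the [003]-[005] stderr lines earlier in the log). A sketch of the two tasks at tests_reapply.yml:58 and :61 follows; the task names and the reapply conditional are taken verbatim from the log, while the use of `var:` for the debug output and the fail_msg wording are assumptions.

    # Sketch of the tasks at tests_reapply.yml:58 and :61.
    - name: Show test_module_run
      debug:
        var: test_module_run          # assumption: could equally be a msg expression

    - name: Assert that reapply is found in log output
      assert:
        that:
          - "'connection reapplied' in test_module_run.stderr"
        fail_msg: "The connection was not reapplied"   # assumed message
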
26764 1726882735.26248: running TaskExecutor() for managed_node2/TASK: Assert that reapply is found in log output 26764 1726882735.26343: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000002f 26764 1726882735.26361: variable 'ansible_search_path' from source: unknown 26764 1726882735.26405: calling self._execute() 26764 1726882735.26505: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.26517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.26533: variable 'omit' from source: magic vars 26764 1726882735.26888: variable 'ansible_distribution_major_version' from source: facts 26764 1726882735.26904: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882735.26915: variable 'omit' from source: magic vars 26764 1726882735.26946: variable 'omit' from source: magic vars 26764 1726882735.26993: variable 'omit' from source: magic vars 26764 1726882735.27037: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882735.27081: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882735.27107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882735.27125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882735.27139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882735.27172: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882735.27179: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.27184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.27292: Set connection var ansible_shell_executable to /bin/sh 26764 1726882735.27301: Set connection var ansible_shell_type to sh 26764 1726882735.27316: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882735.27323: Set connection var ansible_timeout to 10 26764 1726882735.27330: Set connection var ansible_connection to ssh 26764 1726882735.27338: Set connection var ansible_pipelining to False 26764 1726882735.27361: variable 'ansible_shell_executable' from source: unknown 26764 1726882735.27374: variable 'ansible_connection' from source: unknown 26764 1726882735.27380: variable 'ansible_module_compression' from source: unknown 26764 1726882735.27385: variable 'ansible_shell_type' from source: unknown 26764 1726882735.27390: variable 'ansible_shell_executable' from source: unknown 26764 1726882735.27394: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.27400: variable 'ansible_pipelining' from source: unknown 26764 1726882735.27404: variable 'ansible_timeout' from source: unknown 26764 1726882735.27411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.27546: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882735.27561: variable 'omit' from source: magic vars 26764 1726882735.27575: starting attempt loop 26764 1726882735.27581: 
running the handler 26764 1726882735.27725: variable 'test_module_run' from source: set_fact 26764 1726882735.27749: Evaluated conditional ('connection reapplied' in test_module_run.stderr): True 26764 1726882735.27758: handler run complete 26764 1726882735.27782: attempt loop complete, returning result 26764 1726882735.27789: _execute() done 26764 1726882735.27795: dumping result to json 26764 1726882735.27801: done dumping result, returning 26764 1726882735.27811: done running TaskExecutor() for managed_node2/TASK: Assert that reapply is found in log output [0e448fcc-3ce9-9875-c9a3-00000000002f] 26764 1726882735.27819: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002f 26764 1726882735.27921: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000002f 26764 1726882735.27928: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 26764 1726882735.28001: no more pending results, returning what we have 26764 1726882735.28004: results queue empty 26764 1726882735.28005: checking for any_errors_fatal 26764 1726882735.28013: done checking for any_errors_fatal 26764 1726882735.28014: checking for max_fail_percentage 26764 1726882735.28016: done checking for max_fail_percentage 26764 1726882735.28017: checking to see if all hosts have failed and the running result is not ok 26764 1726882735.28018: done checking to see if all hosts have failed 26764 1726882735.28019: getting the remaining hosts for this loop 26764 1726882735.28020: done getting the remaining hosts for this loop 26764 1726882735.28024: getting the next task for host managed_node2 26764 1726882735.28033: done getting next task for host managed_node2 26764 1726882735.28036: ^ task is: TASK: Deactivate the connection and remove the connection profile 26764 1726882735.28039: ^ state is: HOST STATE: block=5, task=10, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 26764 1726882735.28042: getting variables 26764 1726882735.28044: in VariableManager get_vars() 26764 1726882735.28085: Calling all_inventory to load vars for managed_node2 26764 1726882735.28088: Calling groups_inventory to load vars for managed_node2 26764 1726882735.28090: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882735.28102: Calling all_plugins_play to load vars for managed_node2 26764 1726882735.28105: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882735.28108: Calling groups_plugins_play to load vars for managed_node2 26764 1726882735.33856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882735.35485: done with get_vars() 26764 1726882735.35508: done getting variables TASK [Deactivate the connection and remove the connection profile] ************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:72 Friday 20 September 2024 21:38:55 -0400 (0:00:00.098) 0:00:21.297 ****** 26764 1726882735.35579: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 26764 1726882735.35902: worker is 1 (out of 1 available) 26764 1726882735.35915: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 26764 1726882735.35927: done queuing things up, now waiting for results queue to drain 26764 1726882735.35928: waiting for pending results... 26764 1726882735.36200: running TaskExecutor() for managed_node2/TASK: Deactivate the connection and remove the connection profile 26764 1726882735.36316: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000031 26764 1726882735.36335: variable 'ansible_search_path' from source: unknown 26764 1726882735.36382: calling self._execute() 26764 1726882735.36484: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.36498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.36515: variable 'omit' from source: magic vars 26764 1726882735.36886: variable 'ansible_distribution_major_version' from source: facts 26764 1726882735.36905: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882735.36921: variable 'omit' from source: magic vars 26764 1726882735.36961: variable 'omit' from source: magic vars 26764 1726882735.36999: variable 'interface' from source: play vars 26764 1726882735.37078: variable 'interface' from source: play vars 26764 1726882735.37101: variable 'omit' from source: magic vars 26764 1726882735.37150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882735.37193: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882735.37218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882735.37244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882735.37261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882735.37300: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882735.37309: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.37318: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node2' 26764 1726882735.37428: Set connection var ansible_shell_executable to /bin/sh 26764 1726882735.37436: Set connection var ansible_shell_type to sh 26764 1726882735.37454: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882735.37472: Set connection var ansible_timeout to 10 26764 1726882735.37485: Set connection var ansible_connection to ssh 26764 1726882735.37497: Set connection var ansible_pipelining to False 26764 1726882735.37522: variable 'ansible_shell_executable' from source: unknown 26764 1726882735.37530: variable 'ansible_connection' from source: unknown 26764 1726882735.37538: variable 'ansible_module_compression' from source: unknown 26764 1726882735.37545: variable 'ansible_shell_type' from source: unknown 26764 1726882735.37552: variable 'ansible_shell_executable' from source: unknown 26764 1726882735.37558: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.37575: variable 'ansible_pipelining' from source: unknown 26764 1726882735.37584: variable 'ansible_timeout' from source: unknown 26764 1726882735.37593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.37778: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 26764 1726882735.37800: variable 'omit' from source: magic vars 26764 1726882735.37811: starting attempt loop 26764 1726882735.37818: running the handler 26764 1726882735.37836: _low_level_execute_command(): starting 26764 1726882735.37849: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882735.38627: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882735.38643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.38663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.38691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.38735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.38748: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882735.38763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.38789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882735.38802: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882735.38815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882735.38827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.38843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.38860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.38882: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.38895: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882735.38911: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.38990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.39010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882735.39026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.39167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.40825: stdout chunk (state=3): >>>/root <<< 26764 1726882735.40937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.41038: stderr chunk (state=3): >>><<< 26764 1726882735.41056: stdout chunk (state=3): >>><<< 26764 1726882735.41187: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882735.41191: _low_level_execute_command(): starting 26764 1726882735.41194: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073 `" && echo ansible-tmp-1726882735.410963-27655-67830053892073="` echo /root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073 `" ) && sleep 0' 26764 1726882735.41800: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882735.41815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.41836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.41858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.41903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.41917: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882735.41931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.41957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882735.41976: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882735.41988: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882735.42001: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 26764 1726882735.42015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.42035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.42055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.42071: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882735.42091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.42173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.42197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882735.42215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.42345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.44243: stdout chunk (state=3): >>>ansible-tmp-1726882735.410963-27655-67830053892073=/root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073 <<< 26764 1726882735.44360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.44423: stderr chunk (state=3): >>><<< 26764 1726882735.44436: stdout chunk (state=3): >>><<< 26764 1726882735.44453: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882735.410963-27655-67830053892073=/root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882735.44498: variable 'ansible_module_compression' from source: unknown 26764 1726882735.44624: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 26764 1726882735.44627: variable 'ansible_facts' from source: unknown 26764 1726882735.44704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073/AnsiballZ_network_connections.py 26764 1726882735.44846: Sending initial data 26764 1726882735.44850: Sent initial data (166 bytes) 26764 1726882735.45779: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 26764 1726882735.45782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.45785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.45787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.45789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.45791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.45842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.45850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.45948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.47659: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882735.47758: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882735.47881: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpvc6ebe0o /root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073/AnsiballZ_network_connections.py <<< 26764 1726882735.47955: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882735.49729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.49842: stderr chunk (state=3): >>><<< 26764 1726882735.49845: stdout chunk (state=3): >>><<< 26764 1726882735.49866: done transferring module to remote 26764 1726882735.49884: _low_level_execute_command(): starting 26764 1726882735.49887: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073/ /root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073/AnsiballZ_network_connections.py && sleep 0' 26764 1726882735.50484: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882735.50497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.50506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.50516: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.50550: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.50557: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882735.50573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.50587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882735.50594: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882735.50602: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882735.50610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.50625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.50630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.50637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.50644: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882735.50653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.50742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.50750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882735.50754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.50886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.52617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.52680: stderr chunk (state=3): >>><<< 26764 1726882735.52683: stdout chunk (state=3): >>><<< 26764 1726882735.52698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882735.52704: _low_level_execute_command(): starting 26764 1726882735.52706: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073/AnsiballZ_network_connections.py && sleep 0' 26764 1726882735.53283: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882735.53292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.53302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.53316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.53352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.53359: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882735.53375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.53394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882735.53406: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882735.53416: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882735.53428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.53442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.53457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.53475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.53491: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882735.53504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.53584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.53605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882735.53622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.53755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.71823: stdout chunk (state=3): >>> {"failed": true, "msg": "missing required arguments: __header", "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "persistent_state": "absent", "state": "down"}], "ignore_errors": false, "force_state_change": false, "__debug_flags": "", "__header": null}}} <<< 26764 1726882735.73373: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 
<<< 26764 1726882735.73377: stdout chunk (state=3): >>><<< 26764 1726882735.73380: stderr chunk (state=3): >>><<< 26764 1726882735.73383: _low_level_execute_command() done: rc=1, stdout= {"failed": true, "msg": "missing required arguments: __header", "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "rpltstbr", "persistent_state": "absent", "state": "down"}], "ignore_errors": false, "force_state_change": false, "__debug_flags": "", "__header": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.158 closed. 26764 1726882735.73387: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'rpltstbr', 'persistent_state': 'absent', 'state': 'down'}], '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882735.73393: _low_level_execute_command(): starting 26764 1726882735.73395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882735.410963-27655-67830053892073/ > /dev/null 2>&1 && sleep 0' 26764 1726882735.73675: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882735.73680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.73682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.73684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.73772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.73775: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882735.73779: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.73782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882735.73784: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882735.73786: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882735.73789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.73791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.73794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.73797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882735.73799: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882735.73801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.73921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.73928: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882735.73931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.74041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.75834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.75875: stderr chunk (state=3): >>><<< 26764 1726882735.75878: stdout chunk (state=3): >>><<< 26764 1726882735.75896: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882735.75902: handler run complete 26764 1726882735.75917: attempt loop complete, returning result 26764 1726882735.75919: _execute() done 26764 1726882735.75925: dumping result to json 26764 1726882735.75927: done dumping result, returning 26764 1726882735.75936: done running TaskExecutor() for managed_node2/TASK: Deactivate the connection and remove the connection profile [0e448fcc-3ce9-9875-c9a3-000000000031] 26764 1726882735.75940: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000031 26764 1726882735.76038: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000031 26764 1726882735.76040: WORKER PROCESS 
EXITING fatal: [managed_node2]: FAILED! => { "changed": false } MSG: missing required arguments: __header ...ignoring 26764 1726882735.76101: no more pending results, returning what we have 26764 1726882735.76104: results queue empty 26764 1726882735.76105: checking for any_errors_fatal 26764 1726882735.76112: done checking for any_errors_fatal 26764 1726882735.76113: checking for max_fail_percentage 26764 1726882735.76115: done checking for max_fail_percentage 26764 1726882735.76116: checking to see if all hosts have failed and the running result is not ok 26764 1726882735.76116: done checking to see if all hosts have failed 26764 1726882735.76117: getting the remaining hosts for this loop 26764 1726882735.76118: done getting the remaining hosts for this loop 26764 1726882735.76121: getting the next task for host managed_node2 26764 1726882735.76128: done getting next task for host managed_node2 26764 1726882735.76131: ^ task is: TASK: Delete the device '{{ interface }}' 26764 1726882735.76133: ^ state is: HOST STATE: block=5, task=10, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 26764 1726882735.76137: getting variables 26764 1726882735.76139: in VariableManager get_vars() 26764 1726882735.76182: Calling all_inventory to load vars for managed_node2 26764 1726882735.76185: Calling groups_inventory to load vars for managed_node2 26764 1726882735.76187: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882735.76197: Calling all_plugins_play to load vars for managed_node2 26764 1726882735.76199: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882735.76201: Calling groups_plugins_play to load vars for managed_node2 26764 1726882735.77005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882735.78520: done with get_vars() 26764 1726882735.78551: done getting variables 26764 1726882735.78599: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 26764 1726882735.78691: variable 'interface' from source: play vars TASK [Delete the device 'rpltstbr'] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:79 Friday 20 September 2024 21:38:55 -0400 (0:00:00.431) 0:00:21.728 ****** 26764 1726882735.78713: entering _queue_task() for managed_node2/command 26764 1726882735.78922: worker is 1 (out of 1 available) 26764 1726882735.78934: exiting _queue_task() for managed_node2/command 26764 1726882735.78946: done queuing things up, now waiting for results queue to drain 26764 1726882735.78947: waiting for pending results... 
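
The fatal-but-ignored result above comes from the fedora.linux_system_roles.network_connections call whose module_args are dumped a few entries earlier: provider "nm" and a single connection (name "rpltstbr", persistent_state "absent", state "down"), which the module rejected with "missing required arguments: __header". A minimal sketch of what the task at tests_reapply.yml:72 might look like, reconstructed from those module_args (the YAML wording is an assumption; only the argument values, the interface name, and the "...ignoring" behaviour are taken from the log):

    - name: Deactivate the connection and remove the connection profile
      fedora.linux_system_roles.network_connections:
        provider: nm
        connections:
          - name: rpltstbr            # value of the 'interface' play var seen in the log
            persistent_state: absent
            state: down
      ignore_errors: true             # the play prints "...ignoring", so the failure does not abort the run
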
26764 1726882735.79114: running TaskExecutor() for managed_node2/TASK: Delete the device 'rpltstbr' 26764 1726882735.79180: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000032 26764 1726882735.79190: variable 'ansible_search_path' from source: unknown 26764 1726882735.79218: calling self._execute() 26764 1726882735.79295: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.79299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.79310: variable 'omit' from source: magic vars 26764 1726882735.79575: variable 'ansible_distribution_major_version' from source: facts 26764 1726882735.79585: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882735.79591: variable 'omit' from source: magic vars 26764 1726882735.79618: variable 'omit' from source: magic vars 26764 1726882735.79686: variable 'interface' from source: play vars 26764 1726882735.79699: variable 'omit' from source: magic vars 26764 1726882735.79732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882735.79758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882735.79782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882735.79794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882735.79803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882735.79827: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882735.79830: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.79833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.79905: Set connection var ansible_shell_executable to /bin/sh 26764 1726882735.79908: Set connection var ansible_shell_type to sh 26764 1726882735.79915: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882735.79920: Set connection var ansible_timeout to 10 26764 1726882735.79927: Set connection var ansible_connection to ssh 26764 1726882735.79934: Set connection var ansible_pipelining to False 26764 1726882735.79949: variable 'ansible_shell_executable' from source: unknown 26764 1726882735.79952: variable 'ansible_connection' from source: unknown 26764 1726882735.79959: variable 'ansible_module_compression' from source: unknown 26764 1726882735.79961: variable 'ansible_shell_type' from source: unknown 26764 1726882735.79965: variable 'ansible_shell_executable' from source: unknown 26764 1726882735.79971: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882735.79975: variable 'ansible_pipelining' from source: unknown 26764 1726882735.79977: variable 'ansible_timeout' from source: unknown 26764 1726882735.79982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882735.80081: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882735.80088: variable 'omit' from source: magic vars 26764 
1726882735.80094: starting attempt loop 26764 1726882735.80097: running the handler 26764 1726882735.80109: _low_level_execute_command(): starting 26764 1726882735.80116: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882735.80603: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.80620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.80633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.80647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.80696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.80708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.80816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.82417: stdout chunk (state=3): >>>/root <<< 26764 1726882735.82519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.82561: stderr chunk (state=3): >>><<< 26764 1726882735.82566: stdout chunk (state=3): >>><<< 26764 1726882735.82587: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882735.82599: _low_level_execute_command(): starting 26764 1726882735.82608: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245 `" && echo 
ansible-tmp-1726882735.8258631-27676-261604934998245="` echo /root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245 `" ) && sleep 0' 26764 1726882735.83024: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.83043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.83059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration <<< 26764 1726882735.83082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.83124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.83141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.83238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.85098: stdout chunk (state=3): >>>ansible-tmp-1726882735.8258631-27676-261604934998245=/root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245 <<< 26764 1726882735.85204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.85246: stderr chunk (state=3): >>><<< 26764 1726882735.85249: stdout chunk (state=3): >>><<< 26764 1726882735.85262: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882735.8258631-27676-261604934998245=/root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882735.85292: variable 'ansible_module_compression' from source: unknown 26764 1726882735.85330: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26764 1726882735.85356: variable 'ansible_facts' from source: unknown 26764 1726882735.85418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245/AnsiballZ_command.py 26764 1726882735.85518: Sending initial data 26764 1726882735.85527: Sent initial data (156 bytes) 26764 1726882735.86158: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.86162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.86192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 26764 1726882735.86197: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.86246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.86249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.86353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.88106: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882735.88200: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882735.88298: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmppo67xr9w /root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245/AnsiballZ_command.py <<< 26764 1726882735.88391: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882735.89407: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.89519: stderr chunk (state=3): >>><<< 26764 1726882735.89522: stdout chunk (state=3): >>><<< 26764 1726882735.89539: done transferring module to remote 26764 1726882735.89548: _low_level_execute_command(): starting 26764 1726882735.89553: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245/ 
/root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245/AnsiballZ_command.py && sleep 0' 26764 1726882735.90014: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.90018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.90051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.90054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882735.90058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.90105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.90117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.90223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882735.91972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882735.92024: stderr chunk (state=3): >>><<< 26764 1726882735.92028: stdout chunk (state=3): >>><<< 26764 1726882735.92045: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882735.92052: _low_level_execute_command(): starting 26764 1726882735.92055: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245/AnsiballZ_command.py && sleep 0' 26764 1726882735.92496: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882735.92508: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882735.92526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882735.92544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882735.92592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882735.92603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882735.92714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.08025: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "rpltstbr"], "start": "2024-09-20 21:38:56.056297", "end": "2024-09-20 21:38:56.076949", "delta": "0:00:00.020652", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del \"rpltstbr\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26764 1726882736.11690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
<<< 26764 1726882736.11735: stderr chunk (state=3): >>><<< 26764 1726882736.11738: stdout chunk (state=3): >>><<< 26764 1726882736.11771: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "rpltstbr"], "start": "2024-09-20 21:38:56.056297", "end": "2024-09-20 21:38:56.076949", "delta": "0:00:00.020652", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del \"rpltstbr\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 
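
The command run by this task is taken verbatim from the module result above: ip link del "rpltstbr", started 2024-09-20 21:38:56.056297 and finished roughly 20 ms later with rc=0. A minimal sketch of the corresponding task at tests_reapply.yml:79 (hypothetical YAML; the command string and the '{{ interface }}' templating are taken from the log, where interface resolves to rpltstbr):

    - name: Delete the device '{{ interface }}'
      command: ip link del "{{ interface }}"

The printed result a little further on reports "changed": false even though the raw module result has "changed": true, alongside an 'Evaluated conditional (False): False' entry; that pattern suggests the task may also carry something like changed_when: false, but that keyword is an inference, not something visible in the log.
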
26764 1726882736.11894: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del "rpltstbr"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882736.11898: _low_level_execute_command(): starting 26764 1726882736.11901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882735.8258631-27676-261604934998245/ > /dev/null 2>&1 && sleep 0' 26764 1726882736.12483: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882736.12499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.12514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.12534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.12581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.12594: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882736.12609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.12628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.12641: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882736.12655: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882736.12672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.12689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.12705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.12718: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.12730: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882736.12745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.12826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.12850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.12875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.13005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.14851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882736.14958: stderr chunk (state=3): >>><<< 26764 1726882736.14971: stdout chunk (state=3): >>><<< 26764 1726882736.15175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882736.15183: handler run complete 26764 1726882736.15186: Evaluated conditional (False): False 26764 1726882736.15188: attempt loop complete, returning result 26764 1726882736.15190: _execute() done 26764 1726882736.15192: dumping result to json 26764 1726882736.15194: done dumping result, returning 26764 1726882736.15196: done running TaskExecutor() for managed_node2/TASK: Delete the device 'rpltstbr' [0e448fcc-3ce9-9875-c9a3-000000000032] 26764 1726882736.15198: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000032 26764 1726882736.15272: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000032 26764 1726882736.15276: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "rpltstbr" ], "delta": "0:00:00.020652", "end": "2024-09-20 21:38:56.076949", "rc": 0, "start": "2024-09-20 21:38:56.056297" } 26764 1726882736.15343: no more pending results, returning what we have 26764 1726882736.15347: results queue empty 26764 1726882736.15348: checking for any_errors_fatal 26764 1726882736.15358: done checking for any_errors_fatal 26764 1726882736.15359: checking for max_fail_percentage 26764 1726882736.15361: done checking for max_fail_percentage 26764 1726882736.15362: checking to see if all hosts have failed and the running result is not ok 26764 1726882736.15363: done checking to see if all hosts have failed 26764 1726882736.15366: getting the remaining hosts for this loop 26764 1726882736.15367: done getting the remaining hosts for this loop 26764 1726882736.15371: getting the next task for host managed_node2 26764 1726882736.15379: done getting next task for host managed_node2 26764 1726882736.15383: ^ task is: TASK: Verify network state restored to default 26764 1726882736.15388: ^ state is: HOST STATE: block=5, task=10, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 26764 1726882736.15392: getting variables 26764 1726882736.15395: in VariableManager get_vars() 26764 1726882736.15434: Calling all_inventory to load vars for managed_node2 26764 1726882736.15438: Calling groups_inventory to load vars for managed_node2 26764 1726882736.15440: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882736.15452: Calling all_plugins_play to load vars for managed_node2 26764 1726882736.15456: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882736.15459: Calling groups_plugins_play to load vars for managed_node2 26764 1726882736.17228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882736.18990: done with get_vars() 26764 1726882736.19012: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:82 Friday 20 September 2024 21:38:56 -0400 (0:00:00.403) 0:00:22.132 ****** 26764 1726882736.19108: entering _queue_task() for managed_node2/include_tasks 26764 1726882736.19411: worker is 1 (out of 1 available) 26764 1726882736.19423: exiting _queue_task() for managed_node2/include_tasks 26764 1726882736.19435: done queuing things up, now waiting for results queue to drain 26764 1726882736.19436: waiting for pending results... 26764 1726882736.19703: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 26764 1726882736.19826: in run() - task 0e448fcc-3ce9-9875-c9a3-000000000033 26764 1726882736.19846: variable 'ansible_search_path' from source: unknown 26764 1726882736.19898: calling self._execute() 26764 1726882736.20012: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882736.20025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882736.20042: variable 'omit' from source: magic vars 26764 1726882736.20435: variable 'ansible_distribution_major_version' from source: facts 26764 1726882736.20452: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882736.20467: _execute() done 26764 1726882736.20477: dumping result to json 26764 1726882736.20486: done dumping result, returning 26764 1726882736.20495: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0e448fcc-3ce9-9875-c9a3-000000000033] 26764 1726882736.20506: sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000033 26764 1726882736.20626: done sending task result for task 0e448fcc-3ce9-9875-c9a3-000000000033 26764 1726882736.20633: WORKER PROCESS EXITING 26764 1726882736.20671: no more pending results, returning what we have 26764 1726882736.20677: in VariableManager get_vars() 26764 1726882736.20722: Calling all_inventory to load vars for managed_node2 26764 1726882736.20726: Calling groups_inventory to load vars for managed_node2 26764 1726882736.20728: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882736.20744: Calling all_plugins_play to load vars for managed_node2 26764 1726882736.20748: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882736.20751: Calling groups_plugins_play to load vars for managed_node2 26764 1726882736.22518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882736.24246: done with get_vars() 26764 1726882736.24268: 
variable 'ansible_search_path' from source: unknown 26764 1726882736.24283: we have included files to process 26764 1726882736.24284: generating all_blocks data 26764 1726882736.24288: done generating all_blocks data 26764 1726882736.24294: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 26764 1726882736.24295: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 26764 1726882736.24297: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 26764 1726882736.24707: done processing included file 26764 1726882736.24709: iterating over new_blocks loaded from include file 26764 1726882736.24710: in VariableManager get_vars() 26764 1726882736.24726: done with get_vars() 26764 1726882736.24728: filtering new block on tags 26764 1726882736.24768: done filtering new block on tags 26764 1726882736.24771: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 26764 1726882736.24776: extending task lists for all hosts with included blocks 26764 1726882736.24877: done extending task lists 26764 1726882736.24878: done processing included files 26764 1726882736.24879: results queue empty 26764 1726882736.24880: checking for any_errors_fatal 26764 1726882736.24884: done checking for any_errors_fatal 26764 1726882736.24885: checking for max_fail_percentage 26764 1726882736.24886: done checking for max_fail_percentage 26764 1726882736.24887: checking to see if all hosts have failed and the running result is not ok 26764 1726882736.24888: done checking to see if all hosts have failed 26764 1726882736.24888: getting the remaining hosts for this loop 26764 1726882736.24889: done getting the remaining hosts for this loop 26764 1726882736.24892: getting the next task for host managed_node2 26764 1726882736.24896: done getting next task for host managed_node2 26764 1726882736.24898: ^ task is: TASK: Check routes and DNS 26764 1726882736.24901: ^ state is: HOST STATE: block=5, task=10, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 26764 1726882736.24903: getting variables 26764 1726882736.24904: in VariableManager get_vars() 26764 1726882736.24916: Calling all_inventory to load vars for managed_node2 26764 1726882736.24918: Calling groups_inventory to load vars for managed_node2 26764 1726882736.24920: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882736.24925: Calling all_plugins_play to load vars for managed_node2 26764 1726882736.24927: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882736.24930: Calling groups_plugins_play to load vars for managed_node2 26764 1726882736.26133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882736.27809: done with get_vars() 26764 1726882736.27831: done getting variables 26764 1726882736.27876: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:38:56 -0400 (0:00:00.087) 0:00:22.220 ****** 26764 1726882736.27905: entering _queue_task() for managed_node2/shell 26764 1726882736.28214: worker is 1 (out of 1 available) 26764 1726882736.28225: exiting _queue_task() for managed_node2/shell 26764 1726882736.28238: done queuing things up, now waiting for results queue to drain 26764 1726882736.28239: waiting for pending results... 26764 1726882736.28518: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 26764 1726882736.28635: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000047c 26764 1726882736.28653: variable 'ansible_search_path' from source: unknown 26764 1726882736.28659: variable 'ansible_search_path' from source: unknown 26764 1726882736.28706: calling self._execute() 26764 1726882736.28807: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882736.28818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882736.28833: variable 'omit' from source: magic vars 26764 1726882736.29202: variable 'ansible_distribution_major_version' from source: facts 26764 1726882736.29218: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882736.29231: variable 'omit' from source: magic vars 26764 1726882736.29278: variable 'omit' from source: magic vars 26764 1726882736.29309: variable 'omit' from source: magic vars 26764 1726882736.29353: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882736.29399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882736.29428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882736.29456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882736.29480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882736.29516: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
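
For reference, the cleanup-and-verify phase that this part of the log covers boils down to two manual steps on the managed node: removing the leftover test bridge, then dumping the routing and DNS state that the included check_network_dns.yml tasks go on to inspect. A minimal sketch of the equivalent shell session (the device name and the individual commands are taken from the task results in this log; the exact task YAML is not shown here and is assumed):

    # remove the test bridge, as done by the "Delete the device 'rpltstbr'" task above
    ip link del rpltstbr

    # the state that "Verify network state restored to default" / check_network_dns.yml inspects
    ip a
    ip route
    ip -6 route
    cat /etc/resolv.conf
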
26764 1726882736.29525: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882736.29532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882736.29645: Set connection var ansible_shell_executable to /bin/sh 26764 1726882736.29655: Set connection var ansible_shell_type to sh 26764 1726882736.29681: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882736.29692: Set connection var ansible_timeout to 10 26764 1726882736.29701: Set connection var ansible_connection to ssh 26764 1726882736.29711: Set connection var ansible_pipelining to False 26764 1726882736.29738: variable 'ansible_shell_executable' from source: unknown 26764 1726882736.29745: variable 'ansible_connection' from source: unknown 26764 1726882736.29752: variable 'ansible_module_compression' from source: unknown 26764 1726882736.29758: variable 'ansible_shell_type' from source: unknown 26764 1726882736.29768: variable 'ansible_shell_executable' from source: unknown 26764 1726882736.29778: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882736.29786: variable 'ansible_pipelining' from source: unknown 26764 1726882736.29793: variable 'ansible_timeout' from source: unknown 26764 1726882736.29801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882736.29948: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882736.29968: variable 'omit' from source: magic vars 26764 1726882736.29980: starting attempt loop 26764 1726882736.29991: running the handler 26764 1726882736.30005: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882736.30028: _low_level_execute_command(): starting 26764 1726882736.30041: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882736.30832: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882736.30848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.30874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.30895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.30939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.30952: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882736.30975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.30995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.31007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882736.31019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882736.31032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 
1726882736.31046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.31062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.31084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.31096: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882736.31111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.31190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.31216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.31234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.31373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.33037: stdout chunk (state=3): >>>/root <<< 26764 1726882736.33154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882736.33236: stderr chunk (state=3): >>><<< 26764 1726882736.33254: stdout chunk (state=3): >>><<< 26764 1726882736.33401: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882736.33404: _low_level_execute_command(): starting 26764 1726882736.33414: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599 `" && echo ansible-tmp-1726882736.3330367-27688-170187333682599="` echo /root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599 `" ) && sleep 0' 26764 1726882736.34255: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.34259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.34296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882736.34302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.34306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.34376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.34382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.34385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.34492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.36367: stdout chunk (state=3): >>>ansible-tmp-1726882736.3330367-27688-170187333682599=/root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599 <<< 26764 1726882736.36578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882736.36582: stdout chunk (state=3): >>><<< 26764 1726882736.36584: stderr chunk (state=3): >>><<< 26764 1726882736.36871: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882736.3330367-27688-170187333682599=/root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882736.36875: variable 'ansible_module_compression' from source: unknown 26764 1726882736.36877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26764 1726882736.36879: variable 'ansible_facts' from source: unknown 26764 1726882736.36881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599/AnsiballZ_command.py 26764 1726882736.36984: Sending initial data 26764 1726882736.36987: Sent initial data (156 bytes) 26764 1726882736.38026: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882736.38041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.38056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 26764 1726882736.38081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.38127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.38140: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882736.38155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.38177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.38194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882736.38207: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882736.38224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.38238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.38255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.38270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.38284: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882736.38300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.38382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.38405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.38427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.38561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.40331: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882736.40433: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882736.40536: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmph1opv_gx /root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599/AnsiballZ_command.py <<< 26764 1726882736.40639: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882736.41967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882736.42171: stderr chunk (state=3): >>><<< 26764 1726882736.42174: stdout chunk (state=3): >>><<< 26764 1726882736.42176: done transferring module to remote 26764 1726882736.42178: _low_level_execute_command(): starting 26764 1726882736.42181: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599/ 
/root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599/AnsiballZ_command.py && sleep 0' 26764 1726882736.42852: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882736.42867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.42882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.42904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.42949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.42961: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882736.42978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.42995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.43008: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882736.43023: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882736.43036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.43052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.43069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.43082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.43093: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882736.43106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.43192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.43212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.43229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.43366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.45142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882736.45219: stderr chunk (state=3): >>><<< 26764 1726882736.45229: stdout chunk (state=3): >>><<< 26764 1726882736.45333: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882736.45337: _low_level_execute_command(): starting 26764 1726882736.45340: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599/AnsiballZ_command.py && sleep 0' 26764 1726882736.45976: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882736.46018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.46033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.46075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.46127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.46139: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882736.46154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.46174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.46208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882736.46231: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882736.46244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.46258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.46278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.46291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.46303: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882736.46328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.46402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.46426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.46452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.46593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.60633: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3127sec preferred_lft 3127sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 2e:06:5a:d7:92:57 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP 
ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:38:56.595481", "end": "2024-09-20 21:38:56.604010", "delta": "0:00:00.008529", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26764 1726882736.61896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882736.61900: stdout chunk (state=3): >>><<< 26764 1726882736.61902: stderr chunk (state=3): >>><<< 26764 1726882736.62055: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3127sec preferred_lft 3127sec\n inet6 fe80::104f:68ff:fe7a:deb1/64 scope link \n valid_lft forever preferred_lft forever\n30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 2e:06:5a:d7:92:57 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:38:56.595481", "end": "2024-09-20 21:38:56.604010", "delta": "0:00:00.008529", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip 
route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 26764 1726882736.62059: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882736.62062: _low_level_execute_command(): starting 26764 1726882736.62067: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882736.3330367-27688-170187333682599/ > /dev/null 2>&1 && sleep 0' 26764 1726882736.63402: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882736.63417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.63432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.63450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.63508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.63520: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882736.63598: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.63616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.63628: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882736.63639: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882736.63650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.63662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.63680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.63696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.63713: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882736.63726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.63796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.63937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.63955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.64093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.65990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882736.65994: stdout chunk (state=3): >>><<< 26764 1726882736.65996: stderr chunk (state=3): >>><<< 26764 1726882736.66269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882736.66278: handler run complete 26764 1726882736.66281: Evaluated conditional (False): False 26764 1726882736.66283: attempt loop complete, returning result 26764 1726882736.66285: _execute() done 26764 1726882736.66287: dumping result to json 26764 1726882736.66288: done dumping result, returning 26764 1726882736.66290: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0e448fcc-3ce9-9875-c9a3-00000000047c] 26764 1726882736.66292: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000047c 26764 1726882736.66379: done sending task result for task 
0e448fcc-3ce9-9875-c9a3-00000000047c 26764 1726882736.66382: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008529", "end": "2024-09-20 21:38:56.604010", "rc": 0, "start": "2024-09-20 21:38:56.595481" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:4f:68:7a:de:b1 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.158/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3127sec preferred_lft 3127sec inet6 fe80::104f:68ff:fe7a:deb1/64 scope link valid_lft forever preferred_lft forever 30: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether 2e:06:5a:d7:92:57 brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.158 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.158 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 26764 1726882736.66453: no more pending results, returning what we have 26764 1726882736.66457: results queue empty 26764 1726882736.66459: checking for any_errors_fatal 26764 1726882736.66460: done checking for any_errors_fatal 26764 1726882736.66461: checking for max_fail_percentage 26764 1726882736.66463: done checking for max_fail_percentage 26764 1726882736.66466: checking to see if all hosts have failed and the running result is not ok 26764 1726882736.66467: done checking to see if all hosts have failed 26764 1726882736.66468: getting the remaining hosts for this loop 26764 1726882736.66469: done getting the remaining hosts for this loop 26764 1726882736.66473: getting the next task for host managed_node2 26764 1726882736.66482: done getting next task for host managed_node2 26764 1726882736.66485: ^ task is: TASK: Verify DNS and network connectivity 26764 1726882736.66488: ^ state is: HOST STATE: block=5, task=10, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 26764 1726882736.66492: getting variables 26764 1726882736.66494: in VariableManager get_vars() 26764 1726882736.66535: Calling all_inventory to load vars for managed_node2 26764 1726882736.66538: Calling groups_inventory to load vars for managed_node2 26764 1726882736.66541: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882736.66552: Calling all_plugins_play to load vars for managed_node2 26764 1726882736.66556: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882736.66559: Calling groups_plugins_play to load vars for managed_node2 26764 1726882736.68270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882736.70102: done with get_vars() 26764 1726882736.70123: done getting variables 26764 1726882736.70191: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:38:56 -0400 (0:00:00.423) 0:00:22.643 ****** 26764 1726882736.70219: entering _queue_task() for managed_node2/shell 26764 1726882736.70521: worker is 1 (out of 1 available) 26764 1726882736.70532: exiting _queue_task() for managed_node2/shell 26764 1726882736.70544: done queuing things up, now waiting for results queue to drain 26764 1726882736.70545: waiting for pending results... 26764 1726882736.70819: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 26764 1726882736.70942: in run() - task 0e448fcc-3ce9-9875-c9a3-00000000047d 26764 1726882736.70959: variable 'ansible_search_path' from source: unknown 26764 1726882736.70969: variable 'ansible_search_path' from source: unknown 26764 1726882736.71016: calling self._execute() 26764 1726882736.71112: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882736.71129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882736.71143: variable 'omit' from source: magic vars 26764 1726882736.71505: variable 'ansible_distribution_major_version' from source: facts 26764 1726882736.71521: Evaluated conditional (ansible_distribution_major_version != '6'): True 26764 1726882736.71678: variable 'ansible_facts' from source: unknown 26764 1726882736.72448: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 26764 1726882736.72459: variable 'omit' from source: magic vars 26764 1726882736.72513: variable 'omit' from source: magic vars 26764 1726882736.72553: variable 'omit' from source: magic vars 26764 1726882736.72597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 26764 1726882736.72643: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 26764 1726882736.72667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 26764 1726882736.72692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882736.72707: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 26764 1726882736.72748: variable 'inventory_hostname' from source: host vars for 'managed_node2' 26764 1726882736.72756: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882736.72772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882736.72881: Set connection var ansible_shell_executable to /bin/sh 26764 1726882736.72888: Set connection var ansible_shell_type to sh 26764 1726882736.72901: Set connection var ansible_module_compression to ZIP_DEFLATED 26764 1726882736.72910: Set connection var ansible_timeout to 10 26764 1726882736.72919: Set connection var ansible_connection to ssh 26764 1726882736.72928: Set connection var ansible_pipelining to False 26764 1726882736.72963: variable 'ansible_shell_executable' from source: unknown 26764 1726882736.72975: variable 'ansible_connection' from source: unknown 26764 1726882736.72983: variable 'ansible_module_compression' from source: unknown 26764 1726882736.72990: variable 'ansible_shell_type' from source: unknown 26764 1726882736.72996: variable 'ansible_shell_executable' from source: unknown 26764 1726882736.73002: variable 'ansible_host' from source: host vars for 'managed_node2' 26764 1726882736.73009: variable 'ansible_pipelining' from source: unknown 26764 1726882736.73016: variable 'ansible_timeout' from source: unknown 26764 1726882736.73024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 26764 1726882736.73179: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882736.73199: variable 'omit' from source: magic vars 26764 1726882736.73210: starting attempt loop 26764 1726882736.73216: running the handler 26764 1726882736.73230: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 26764 1726882736.73252: _low_level_execute_command(): starting 26764 1726882736.73271: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 26764 1726882736.74046: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882736.74069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.74084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.74102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.74144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.74156: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882736.74181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.74198: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.74210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 
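
Unescaped from the _raw_params shown in the "Check routes and DNS" result above (the \n escapes expanded for readability), the script that produced that task's STDOUT block is:

    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi

The echo markers (IP, IP ROUTE, IP -6 ROUTE, RESOLV) are what separate the sections visible in the captured STDOUT of that result.
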
26764 1726882736.74221: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882736.74233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.74251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.74272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.74286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.74297: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882736.74310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.74394: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.74415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.74431: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.74562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.76209: stdout chunk (state=3): >>>/root <<< 26764 1726882736.76302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882736.76381: stderr chunk (state=3): >>><<< 26764 1726882736.76384: stdout chunk (state=3): >>><<< 26764 1726882736.76472: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882736.76486: _low_level_execute_command(): starting 26764 1726882736.76489: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134 `" && echo ansible-tmp-1726882736.763977-27707-47999784613134="` echo /root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134 `" ) && sleep 0' 26764 1726882736.77115: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 26764 1726882736.77137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.77152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.77171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
26764 1726882736.77216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.77236: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882736.77257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.77278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.77289: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address <<< 26764 1726882736.77302: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 26764 1726882736.77313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.77325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.77345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.77361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.77374: stderr chunk (state=3): >>>debug2: match found <<< 26764 1726882736.77391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.77480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.77503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.77518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.77645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.79518: stdout chunk (state=3): >>>ansible-tmp-1726882736.763977-27707-47999784613134=/root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134 <<< 26764 1726882736.79623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882736.79677: stderr chunk (state=3): >>><<< 26764 1726882736.79685: stdout chunk (state=3): >>><<< 26764 1726882736.79774: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882736.763977-27707-47999784613134=/root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882736.79778: variable 'ansible_module_compression' from source: 
unknown 26764 1726882736.79780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-26764trh16hvb/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 26764 1726882736.79787: variable 'ansible_facts' from source: unknown 26764 1726882736.79844: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134/AnsiballZ_command.py 26764 1726882736.79941: Sending initial data 26764 1726882736.79944: Sent initial data (154 bytes) 26764 1726882736.80574: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.80580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.80589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.80616: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.80623: stderr chunk (state=3): >>>debug2: match not found <<< 26764 1726882736.80631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.80643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.80647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.80669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found <<< 26764 1726882736.80674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.80718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.80737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.80740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.80852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.82569: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 26764 1726882736.82659: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 26764 1726882736.82757: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-26764trh16hvb/tmpzdjqhwn3 /root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134/AnsiballZ_command.py <<< 26764 1726882736.82849: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 26764 1726882736.83855: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 26764 1726882736.83943: stderr chunk (state=3): >>><<< 26764 1726882736.83947: stdout chunk (state=3): >>><<< 26764 1726882736.83962: done transferring module to remote 26764 1726882736.83974: _low_level_execute_command(): starting 26764 1726882736.83977: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134/ /root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134/AnsiballZ_command.py && sleep 0' 26764 1726882736.84372: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882736.84377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.84419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.84423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.84425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.84476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.84480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882736.84488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.84598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882736.86342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882736.86384: stderr chunk (state=3): >>><<< 26764 1726882736.86388: stdout chunk (state=3): >>><<< 26764 1726882736.86402: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 
1726882736.86408: _low_level_execute_command(): starting 26764 1726882736.86413: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134/AnsiballZ_command.py && sleep 0' 26764 1726882736.86808: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.86814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.86847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 <<< 26764 1726882736.86853: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.86863: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 26764 1726882736.86871: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 26764 1726882736.86880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882736.86885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882736.86935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882736.86959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882736.87058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882737.44966: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1418 0 --:--:-- --:--:-- --:--:-- 1425\n % Total % Received % Xferd Average Speed Time Time Time Current\n 
Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1392 0 --:--:-- --:--:-- --:--:-- 1399", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:38:56.999333", "end": "2024-09-20 21:38:57.447498", "delta": "0:00:00.448165", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 26764 1726882737.46269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. <<< 26764 1726882737.46325: stderr chunk (state=3): >>><<< 26764 1726882737.46329: stdout chunk (state=3): >>><<< 26764 1726882737.46346: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1418 0 --:--:-- --:--:-- --:--:-- 1425\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1392 0 --:--:-- --:--:-- --:--:-- 1399", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:38:56.999333", "end": "2024-09-20 21:38:57.447498", "delta": "0:00:00.448165", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.158 closed. 26764 1726882737.46383: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 26764 1726882737.46392: _low_level_execute_command(): starting 26764 1726882737.46395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882736.763977-27707-47999784613134/ > /dev/null 2>&1 && sleep 0' 26764 1726882737.46871: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882737.46874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 26764 1726882737.46922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found <<< 26764 1726882737.46925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 26764 1726882737.46928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 26764 1726882737.46981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 26764 1726882737.46985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 26764 1726882737.46995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 26764 1726882737.47102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 26764 1726882737.48922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 26764 1726882737.48973: stderr chunk (state=3): >>><<< 26764 1726882737.48978: stdout chunk (state=3): >>><<< 26764 1726882737.48991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.158 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.158 originally 10.31.11.158 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 26764 1726882737.48998: handler run complete 26764 1726882737.49016: Evaluated conditional (False): False 26764 1726882737.49023: attempt loop complete, returning result 26764 1726882737.49026: _execute() done 26764 1726882737.49028: dumping result to json 26764 1726882737.49034: done dumping result, returning 26764 1726882737.49044: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-9875-c9a3-00000000047d] 26764 1726882737.49047: sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000047d 26764 1726882737.49154: done sending task result for task 0e448fcc-3ce9-9875-c9a3-00000000047d 26764 1726882737.49156: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.448165", "end": "2024-09-20 21:38:57.447498", "rc": 0, "start": "2024-09-20 21:38:56.999333" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1418 0 --:--:-- --:--:-- --:--:-- 1425 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1392 0 --:--:-- --:--:-- --:--:-- 1399 26764 1726882737.49240: no more pending results, returning what we have 26764 1726882737.49244: results queue empty 26764 1726882737.49245: checking for any_errors_fatal 26764 1726882737.49254: done checking for 
any_errors_fatal 26764 1726882737.49255: checking for max_fail_percentage 26764 1726882737.49257: done checking for max_fail_percentage 26764 1726882737.49258: checking to see if all hosts have failed and the running result is not ok 26764 1726882737.49259: done checking to see if all hosts have failed 26764 1726882737.49259: getting the remaining hosts for this loop 26764 1726882737.49261: done getting the remaining hosts for this loop 26764 1726882737.49267: getting the next task for host managed_node2 26764 1726882737.49277: done getting next task for host managed_node2 26764 1726882737.49279: ^ task is: TASK: meta (flush_handlers) 26764 1726882737.49281: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882737.49285: getting variables 26764 1726882737.49286: in VariableManager get_vars() 26764 1726882737.49319: Calling all_inventory to load vars for managed_node2 26764 1726882737.49321: Calling groups_inventory to load vars for managed_node2 26764 1726882737.49323: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882737.49333: Calling all_plugins_play to load vars for managed_node2 26764 1726882737.49336: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882737.49338: Calling groups_plugins_play to load vars for managed_node2 26764 1726882737.50229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882737.51161: done with get_vars() 26764 1726882737.51178: done getting variables 26764 1726882737.51226: in VariableManager get_vars() 26764 1726882737.51236: Calling all_inventory to load vars for managed_node2 26764 1726882737.51237: Calling groups_inventory to load vars for managed_node2 26764 1726882737.51238: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882737.51243: Calling all_plugins_play to load vars for managed_node2 26764 1726882737.51244: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882737.51246: Calling groups_plugins_play to load vars for managed_node2 26764 1726882737.51940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882737.52874: done with get_vars() 26764 1726882737.52892: done queuing things up, now waiting for results queue to drain 26764 1726882737.52893: results queue empty 26764 1726882737.52894: checking for any_errors_fatal 26764 1726882737.52896: done checking for any_errors_fatal 26764 1726882737.52896: checking for max_fail_percentage 26764 1726882737.52897: done checking for max_fail_percentage 26764 1726882737.52897: checking to see if all hosts have failed and the running result is not ok 26764 1726882737.52898: done checking to see if all hosts have failed 26764 1726882737.52898: getting the remaining hosts for this loop 26764 1726882737.52899: done getting the remaining hosts for this loop 26764 1726882737.52901: getting the next task for host managed_node2 26764 1726882737.52904: done getting next task for host managed_node2 26764 1726882737.52905: ^ task is: TASK: meta (flush_handlers) 26764 1726882737.52906: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 26764 1726882737.52907: getting variables 26764 1726882737.52908: in VariableManager get_vars() 26764 1726882737.52914: Calling all_inventory to load vars for managed_node2 26764 1726882737.52916: Calling groups_inventory to load vars for managed_node2 26764 1726882737.52917: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882737.52920: Calling all_plugins_play to load vars for managed_node2 26764 1726882737.52922: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882737.52924: Calling groups_plugins_play to load vars for managed_node2 26764 1726882737.53625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882737.54525: done with get_vars() 26764 1726882737.54538: done getting variables 26764 1726882737.54573: in VariableManager get_vars() 26764 1726882737.54582: Calling all_inventory to load vars for managed_node2 26764 1726882737.54584: Calling groups_inventory to load vars for managed_node2 26764 1726882737.54585: Calling all_plugins_inventory to load vars for managed_node2 26764 1726882737.54589: Calling all_plugins_play to load vars for managed_node2 26764 1726882737.54594: Calling groups_plugins_inventory to load vars for managed_node2 26764 1726882737.54596: Calling groups_plugins_play to load vars for managed_node2 26764 1726882737.55255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 26764 1726882737.56176: done with get_vars() 26764 1726882737.56192: done queuing things up, now waiting for results queue to drain 26764 1726882737.56194: results queue empty 26764 1726882737.56194: checking for any_errors_fatal 26764 1726882737.56195: done checking for any_errors_fatal 26764 1726882737.56195: checking for max_fail_percentage 26764 1726882737.56196: done checking for max_fail_percentage 26764 1726882737.56196: checking to see if all hosts have failed and the running result is not ok 26764 1726882737.56197: done checking to see if all hosts have failed 26764 1726882737.56197: getting the remaining hosts for this loop 26764 1726882737.56198: done getting the remaining hosts for this loop 26764 1726882737.56201: getting the next task for host managed_node2 26764 1726882737.56203: done getting next task for host managed_node2 26764 1726882737.56204: ^ task is: None 26764 1726882737.56205: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 26764 1726882737.56205: done queuing things up, now waiting for results queue to drain 26764 1726882737.56206: results queue empty 26764 1726882737.56206: checking for any_errors_fatal 26764 1726882737.56207: done checking for any_errors_fatal 26764 1726882737.56207: checking for max_fail_percentage 26764 1726882737.56208: done checking for max_fail_percentage 26764 1726882737.56208: checking to see if all hosts have failed and the running result is not ok 26764 1726882737.56208: done checking to see if all hosts have failed 26764 1726882737.56210: getting the next task for host managed_node2 26764 1726882737.56212: done getting next task for host managed_node2 26764 1726882737.56213: ^ task is: None 26764 1726882737.56213: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2              : ok=58   changed=2    unreachable=0    failed=0    skipped=47   rescued=0    ignored=1

Friday 20 September 2024  21:38:57 -0400 (0:00:00.860)       0:00:23.504 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.78s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.72s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which packages are installed --- 1.18s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.13s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_reapply_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.00s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 0.95s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:7
fedora.linux_system_roles.network : Check which packages are installed --- 0.88s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Check if system is ostree ----------------------------------------------- 0.87s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Verify DNS and network connectivity ------------------------------------- 0.86s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gather current interface info ------------------------------------------- 0.86s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Gather the minimum subset of ansible_facts required by the network role test --- 0.73s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.66s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.64s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.64s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.59s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Get stat for interface rpltstbr ----------------------------------------- 0.53s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Get network_connections output ------------------------------------------ 0.46s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:43
Deactivate the connection and remove the connection profile ------------- 0.43s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_reapply.yml:72
Check routes and DNS ---------------------------------------------------- 0.42s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.42s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
26764 1726882737.56295: RUNNING CLEANUP
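For reference, the shell payload behind the "Verify DNS and network connectivity" task (check_network_dns.yml:24 in the recap above) is extracted here from the module's _raw_params and lightly commented so it can be run by hand on a managed node. The bash shebang is an addition, since set -o pipefail needs bash or a comparably featured shell; everything else is verbatim from the log.

#!/bin/bash
set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
  # Resolve the name through the system resolver (getent honours nsswitch).
  if ! getent hosts "$host"; then
    echo FAILED to lookup host "$host"
    exit 1
  fi
  # Reachability probe. Without -s the progress meter goes to stderr, which is
  # why the task result above carries curl's transfer table, and without -f
  # only transport-level failures (DNS, connect, TLS) make curl exit non-zero.
  if ! curl -o /dev/null https://"$host"; then
    echo FAILED to contact host "$host"
    exit 1
  fi
done

In this run the script finished with rc=0 in roughly 0.45 s and resolved only IPv6 addresses for both mirror hostnames, matching the STDOUT captured in the task result. The callback reports the task as ok with "changed": false even though the command module itself returned "changed": true, which is consistent with the task setting changed_when: false; the playbook source is not part of this log, so that remains an inference.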