[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 22225 1726882743.54400: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-AQL executable location = /usr/local/bin/ansible-playbook python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 22225 1726882743.55512: Added group all to inventory 22225 1726882743.55515: Added group ungrouped to inventory 22225 1726882743.55520: Group all now contains ungrouped 22225 1726882743.55526: Examining possible inventory source: /tmp/network-mVt/inventory.yml 22225 1726882743.86601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 22225 1726882743.87069: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 22225 1726882743.87094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 22225 1726882743.87162: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 22225 1726882743.87358: Loaded config def from plugin (inventory/script) 22225 1726882743.87361: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 22225 1726882743.87403: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 22225 1726882743.87503: Loaded config def from plugin (inventory/yaml) 22225 1726882743.87505: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 22225 1726882743.87802: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 22225 1726882743.88946: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 22225 1726882743.88950: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 22225 1726882743.88953: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 22225 1726882743.88960: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 22225 1726882743.88965: Loading data from /tmp/network-mVt/inventory.yml 22225 1726882743.89040: /tmp/network-mVt/inventory.yml was not parsable by auto 22225 1726882743.89376: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 22225 1726882743.89419: Loading data from /tmp/network-mVt/inventory.yml 22225 1726882743.89573: group all already in inventory 22225 1726882743.89581: set inventory_file for managed_node1 22225 1726882743.89643: set inventory_dir for managed_node1 22225 1726882743.89644: Added host managed_node1 to inventory 22225 1726882743.89647: Added host managed_node1 to group all 22225 1726882743.89648: set ansible_host for managed_node1 22225 1726882743.89649: 
set ansible_ssh_extra_args for managed_node1 22225 1726882743.89654: set inventory_file for managed_node2 22225 1726882743.89657: set inventory_dir for managed_node2 22225 1726882743.89658: Added host managed_node2 to inventory 22225 1726882743.89660: Added host managed_node2 to group all 22225 1726882743.89661: set ansible_host for managed_node2 22225 1726882743.89662: set ansible_ssh_extra_args for managed_node2 22225 1726882743.89664: set inventory_file for managed_node3 22225 1726882743.89667: set inventory_dir for managed_node3 22225 1726882743.89668: Added host managed_node3 to inventory 22225 1726882743.89669: Added host managed_node3 to group all 22225 1726882743.89670: set ansible_host for managed_node3 22225 1726882743.89671: set ansible_ssh_extra_args for managed_node3 22225 1726882743.89673: Reconcile groups and hosts in inventory. 22225 1726882743.89678: Group ungrouped now contains managed_node1 22225 1726882743.89680: Group ungrouped now contains managed_node2 22225 1726882743.89681: Group ungrouped now contains managed_node3 22225 1726882743.89875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 22225 1726882743.90237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 22225 1726882743.90399: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 22225 1726882743.90432: Loaded config def from plugin (vars/host_group_vars) 22225 1726882743.90434: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 22225 1726882743.90442: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 22225 1726882743.90451: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 22225 1726882743.90638: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 22225 1726882743.91716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882743.91817: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 22225 1726882743.92217: Loaded config def from plugin (connection/local) 22225 1726882743.92221: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 22225 1726882743.93767: Loaded config def from plugin (connection/paramiko_ssh) 22225 1726882743.93771: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 22225 1726882743.95645: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 22225 1726882743.95688: Loaded config def from plugin (connection/psrp) 22225 1726882743.95691: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 22225 1726882743.97416: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 22225 1726882743.97666: Loaded config def from plugin (connection/ssh) 22225 1726882743.97670: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 22225 1726882744.02359: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 22225 1726882744.02405: Loaded config def from plugin (connection/winrm) 22225 1726882744.02408: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 22225 1726882744.02558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 22225 1726882744.02714: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 22225 1726882744.02908: Loaded config def from plugin (shell/cmd) 22225 1726882744.02911: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 22225 1726882744.02942: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 22225 1726882744.03156: Loaded config def from plugin (shell/powershell) 22225 1726882744.03159: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 22225 1726882744.03458: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 22225 1726882744.03862: Loaded config def from plugin (shell/sh) 22225 1726882744.03865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 22225 1726882744.03904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 22225 1726882744.04444: Loaded config def from plugin (become/runas) 22225 1726882744.04446: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 22225 1726882744.04857: Loaded config def from plugin (become/su) 22225 1726882744.04859: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 22225 1726882744.05243: Loaded config def from plugin (become/sudo) 22225 1726882744.05246: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 22225 1726882744.05286: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml 22225 1726882744.06251: in VariableManager get_vars() 22225 1726882744.06274: done with get_vars() 22225 1726882744.06419: trying /usr/local/lib/python3.12/site-packages/ansible/modules 22225 1726882744.14406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 22225 1726882744.14986: in VariableManager get_vars() 22225 
1726882744.14991: done with get_vars() 22225 1726882744.14994: variable 'playbook_dir' from source: magic vars 22225 1726882744.14995: variable 'ansible_playbook_python' from source: magic vars 22225 1726882744.14996: variable 'ansible_config_file' from source: magic vars 22225 1726882744.14997: variable 'groups' from source: magic vars 22225 1726882744.14998: variable 'omit' from source: magic vars 22225 1726882744.14999: variable 'ansible_version' from source: magic vars 22225 1726882744.15000: variable 'ansible_check_mode' from source: magic vars 22225 1726882744.15000: variable 'ansible_diff_mode' from source: magic vars 22225 1726882744.15001: variable 'ansible_forks' from source: magic vars 22225 1726882744.15002: variable 'ansible_inventory_sources' from source: magic vars 22225 1726882744.15003: variable 'ansible_skip_tags' from source: magic vars 22225 1726882744.15004: variable 'ansible_limit' from source: magic vars 22225 1726882744.15004: variable 'ansible_run_tags' from source: magic vars 22225 1726882744.15005: variable 'ansible_verbosity' from source: magic vars 22225 1726882744.15047: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml 22225 1726882744.16214: in VariableManager get_vars() 22225 1726882744.16275: done with get_vars() 22225 1726882744.16316: in VariableManager get_vars() 22225 1726882744.16489: done with get_vars() 22225 1726882744.16944: in VariableManager get_vars() 22225 1726882744.17074: done with get_vars() 22225 1726882744.17080: variable 'omit' from source: magic vars 22225 1726882744.17101: variable 'omit' from source: magic vars 22225 1726882744.17143: in VariableManager get_vars() 22225 1726882744.17155: done with get_vars() 22225 1726882744.17319: in VariableManager get_vars() 22225 1726882744.17530: done with get_vars() 22225 1726882744.17571: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 22225 1726882744.18306: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 22225 1726882744.18853: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 22225 1726882744.21127: in VariableManager get_vars() 22225 1726882744.21149: done with get_vars() 22225 1726882744.22301: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 22225 1726882744.22863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22225 1726882744.28646: in VariableManager get_vars() 22225 1726882744.28668: done with get_vars() 22225 1726882744.28709: in VariableManager get_vars() 22225 1726882744.29260: done with get_vars() 22225 1726882744.31166: in VariableManager get_vars() 22225 1726882744.31188: done with get_vars() 22225 1726882744.31195: variable 'omit' from source: magic vars 22225 1726882744.31207: variable 'omit' from source: magic vars 22225 1726882744.31245: in VariableManager get_vars() 22225 1726882744.31260: done with get_vars() 22225 1726882744.31281: in VariableManager get_vars() 22225 1726882744.31297: done with get_vars() 22225 1726882744.31541: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 22225 1726882744.31877: Loading data from 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 22225 1726882744.32342: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 22225 1726882744.37163: in VariableManager get_vars() 22225 1726882744.37192: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22225 1726882744.42292: in VariableManager get_vars() 22225 1726882744.42317: done with get_vars() 22225 1726882744.42663: in VariableManager get_vars() 22225 1726882744.42684: done with get_vars() 22225 1726882744.42744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 22225 1726882744.42759: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 22225 1726882744.43417: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 22225 1726882744.43602: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 22225 1726882744.43605: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 22225 1726882744.43846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 22225 1726882744.43878: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 22225 1726882744.44276: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 22225 1726882744.44543: Loaded config def from plugin (callback/default) 22225 1726882744.44546: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 22225 1726882744.60020: Loaded config def from plugin (callback/junit) 22225 1726882744.60025: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 22225 1726882744.60084: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 22225 1726882744.60366: Loaded config def from plugin (callback/minimal) 22225 1726882744.60369: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 22225 
1726882744.60413: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 22225 1726882744.60481: Loaded config def from plugin (callback/tree) 22225 1726882744.60484: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 22225 1726882744.60818: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 22225 1726882744.60821: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-AQL/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. PLAYBOOK: tests_ipv6_nm.yml **************************************************** 2 plays in /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml 22225 1726882744.60854: in VariableManager get_vars() 22225 1726882744.60870: done with get_vars() 22225 1726882744.60877: in VariableManager get_vars() 22225 1726882744.60886: done with get_vars() 22225 1726882744.60899: variable 'omit' from source: magic vars 22225 1726882744.60945: in VariableManager get_vars() 22225 1726882744.60961: done with get_vars() 22225 1726882744.60985: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_ipv6.yml' with nm as provider] ************* 22225 1726882744.62404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 22225 1726882744.62486: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 22225 1726882744.62544: getting the remaining hosts for this loop 22225 1726882744.62547: done getting the remaining hosts for this loop 22225 1726882744.62550: getting the next task for host managed_node1 22225 1726882744.62554: done getting next task for host managed_node1 22225 1726882744.62556: ^ task is: TASK: Gathering Facts 22225 1726882744.62558: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882744.62561: getting variables 22225 1726882744.62562: in VariableManager get_vars() 22225 1726882744.62573: Calling all_inventory to load vars for managed_node1 22225 1726882744.62576: Calling groups_inventory to load vars for managed_node1 22225 1726882744.62579: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882744.62593: Calling all_plugins_play to load vars for managed_node1 22225 1726882744.62606: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882744.62610: Calling groups_plugins_play to load vars for managed_node1 22225 1726882744.62853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882744.62911: done with get_vars() 22225 1726882744.62918: done getting variables 22225 1726882744.62990: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6 Friday 20 September 2024 21:39:04 -0400 (0:00:00.022) 0:00:00.022 ****** 22225 1726882744.63014: entering _queue_task() for managed_node1/gather_facts 22225 1726882744.63016: Creating lock for gather_facts 22225 1726882744.63790: worker is 1 (out of 1 available) 22225 1726882744.63801: exiting _queue_task() for managed_node1/gather_facts 22225 1726882744.63816: done queuing things up, now waiting for results queue to drain 22225 1726882744.63819: waiting for pending results... 
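The inventory parsed by the yaml plugin near the top of this run sets inventory_file, inventory_dir, ansible_host and ansible_ssh_extra_args for managed_node1 through managed_node3 and leaves all three hosts in the implicit all/ungrouped groups. A minimal sketch of the shape /tmp/network-mVt/inventory.yml would need to have to produce those entries follows; the host names and variable keys are taken from the log, but every value is a placeholder, since the file's contents are never dumped in this output.

all:
  hosts:
    managed_node1:
      ansible_host: 10.31.15.7            # placeholder; 10.31.15.7 is only inferred from the SSH debug output further down
      ansible_ssh_extra_args: "-o ..."    # placeholder; the actual options are not shown in the log
    managed_node2:
      ansible_host: "<address>"           # placeholder
      ansible_ssh_extra_args: "-o ..."    # placeholder
    managed_node3:
      ansible_host: "<address>"           # placeholder
      ansible_ssh_extra_args: "-o ..."    # placeholder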
22225 1726882744.64541: running TaskExecutor() for managed_node1/TASK: Gathering Facts 22225 1726882744.64631: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000b9 22225 1726882744.64635: variable 'ansible_search_path' from source: unknown 22225 1726882744.64639: calling self._execute() 22225 1726882744.64708: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882744.64857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882744.64874: variable 'omit' from source: magic vars 22225 1726882744.65060: variable 'omit' from source: magic vars 22225 1726882744.65095: variable 'omit' from source: magic vars 22225 1726882744.65135: variable 'omit' from source: magic vars 22225 1726882744.65429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882744.65432: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882744.65439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882744.65461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882744.65475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882744.65507: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882744.65544: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882744.65552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882744.65767: Set connection var ansible_connection to ssh 22225 1726882744.65783: Set connection var ansible_pipelining to False 22225 1726882744.65837: Set connection var ansible_shell_executable to /bin/sh 22225 1726882744.65847: Set connection var ansible_timeout to 10 22225 1726882744.65852: Set connection var ansible_shell_type to sh 22225 1726882744.65864: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882744.65892: variable 'ansible_shell_executable' from source: unknown 22225 1726882744.65978: variable 'ansible_connection' from source: unknown 22225 1726882744.65986: variable 'ansible_module_compression' from source: unknown 22225 1726882744.65993: variable 'ansible_shell_type' from source: unknown 22225 1726882744.65999: variable 'ansible_shell_executable' from source: unknown 22225 1726882744.66004: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882744.66010: variable 'ansible_pipelining' from source: unknown 22225 1726882744.66015: variable 'ansible_timeout' from source: unknown 22225 1726882744.66023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882744.66466: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882744.66535: variable 'omit' from source: magic vars 22225 1726882744.66545: starting attempt loop 22225 1726882744.66730: running the handler 22225 1726882744.66734: variable 'ansible_facts' from source: unknown 22225 1726882744.66736: _low_level_execute_command(): starting 22225 1726882744.66738: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882744.68369: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882744.68403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882744.68469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882744.68690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882744.68710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882744.68835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882744.68879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22225 1726882744.70909: stdout chunk (state=3): >>>/root <<< 22225 1726882744.70934: stdout chunk (state=3): >>><<< 22225 1726882744.71005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882744.71017: stderr chunk (state=3): >>><<< 22225 1726882744.71053: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22225 1726882744.71333: _low_level_execute_command(): starting 22225 1726882744.71338: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732 `" && echo ansible-tmp-1726882744.7122989-22250-4465326278732="` echo /root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732 `" ) && sleep 0' 22225 1726882744.72329: 
stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882744.72339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882744.72580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882744.72695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22225 1726882744.75155: stdout chunk (state=3): >>>ansible-tmp-1726882744.7122989-22250-4465326278732=/root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732 <<< 22225 1726882744.75168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882744.75232: stderr chunk (state=3): >>><<< 22225 1726882744.75314: stdout chunk (state=3): >>><<< 22225 1726882744.75346: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882744.7122989-22250-4465326278732=/root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22225 1726882744.75395: variable 'ansible_module_compression' from source: unknown 22225 1726882744.75734: ANSIBALLZ: Using generic lock for ansible.legacy.setup 22225 1726882744.75738: ANSIBALLZ: Acquiring lock 22225 1726882744.75741: ANSIBALLZ: Lock acquired: 140272895053888 22225 1726882744.75743: ANSIBALLZ: Creating module 22225 1726882745.46584: ANSIBALLZ: Writing module into payload 22225 1726882745.46751: ANSIBALLZ: Writing module 22225 1726882745.46781: ANSIBALLZ: Renaming module 
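Because the connection vars printed for this task set ansible_pipelining to False, execution takes the non-pipelined path seen above: one _low_level_execute_command() resolves the remote home directory (echo ~), a second creates the ansible-tmp-* staging directory under it, and the ANSIBALLZ payload built here is copied over sftp further down before being run. A minimal sketch of the per-host variables that govern this path, assuming they were set in host_vars (the option names are the standard connection and shell plugin options whose doc fragments are loaded earlier in this log; the values are illustrative, not taken from this run):

# host_vars/managed_node1.yml (illustrative sketch)
ansible_pipelining: true            # where the connection and become settings allow it, pipe the module to the interpreter instead of staging it in a temp dir
ansible_remote_tmp: ~/.ansible/tmp  # base directory for the ansible-tmp-* staging directories created above (this run used the default under /root)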
22225 1726882745.46813: ANSIBALLZ: Done creating module 22225 1726882745.46838: variable 'ansible_facts' from source: unknown 22225 1726882745.46844: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882745.46850: _low_level_execute_command(): starting 22225 1726882745.46855: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 22225 1726882745.47841: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882745.48254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882745.48257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882745.48260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882745.48262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882745.50128: stdout chunk (state=3): >>>PLATFORM <<< 22225 1726882745.50242: stdout chunk (state=3): >>>Linux <<< 22225 1726882745.50263: stdout chunk (state=3): >>>FOUND <<< 22225 1726882745.50288: stdout chunk (state=3): >>>/usr/bin/python3.12 <<< 22225 1726882745.50307: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 <<< 22225 1726882745.50317: stdout chunk (state=3): >>>ENDFOUND <<< 22225 1726882745.50611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882745.50851: stderr chunk (state=3): >>><<< 22225 1726882745.50854: stdout chunk (state=3): >>><<< 22225 1726882745.50874: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882745.50908 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 22225 1726882745.50937: _low_level_execute_command(): starting 22225 1726882745.50943: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 22225 1726882745.51230: Sending initial data 22225 1726882745.51233: Sent initial data (1181 bytes) 22225 1726882745.52328: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882745.52332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882745.52655: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882745.52682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882745.52762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882745.57393: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 22225 1726882745.57889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882745.57892: stdout chunk (state=3): >>><<< 22225 1726882745.57897: stderr chunk (state=3): >>><<< 22225 1726882745.57973: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora 
Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882745.58429: variable 'ansible_facts' from source: unknown 22225 1726882745.58432: variable 'ansible_facts' from source: unknown 22225 1726882745.58436: variable 'ansible_module_compression' from source: unknown 22225 1726882745.58546: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22225 1726882745.58773: variable 'ansible_facts' from source: unknown 22225 1726882745.59159: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/AnsiballZ_setup.py 22225 1726882745.59772: Sending initial data 22225 1726882745.59775: Sent initial data (152 bytes) 22225 1726882745.60971: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882745.61076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882745.61352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882745.61356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882745.63397: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882745.63545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/AnsiballZ_setup.py" <<< 22225 1726882745.63548: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp7ohbepph /root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/AnsiballZ_setup.py <<< 22225 1726882745.63728: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp7ohbepph" to remote "/root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/AnsiballZ_setup.py" <<< 22225 1726882745.63732: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/AnsiballZ_setup.py" <<< 22225 1726882745.65795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882745.66272: stderr chunk (state=3): >>><<< 22225 1726882745.66276: stdout chunk (state=3): >>><<< 22225 1726882745.66279: done transferring module to remote 22225 1726882745.66281: _low_level_execute_command(): starting 22225 1726882745.66283: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/ /root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/AnsiballZ_setup.py && sleep 0' 22225 1726882745.67729: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882745.68029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882745.68343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882745.68346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882745.70274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882745.70367: stderr chunk (state=3): >>><<< 22225 1726882745.70437: stdout chunk (state=3): >>><<< 22225 1726882745.70461: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882745.70464: _low_level_execute_command(): starting 22225 1726882745.70470: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/AnsiballZ_setup.py && sleep 0' 22225 1726882745.71509: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882745.71740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882745.71751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882745.71765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882745.71777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882745.71787: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882745.71797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882745.71811: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882745.71819: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882745.71828: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882745.71836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882745.71846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882745.71857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882745.71865: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.7 originally 10.31.15.7 <<< 22225 1726882745.71944: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882745.72158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882745.72218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882745.74679: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 22225 1726882745.74684: stdout chunk (state=3): >>>import _imp # builtin <<< 22225 1726882745.74686: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 22225 1726882745.74787: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 22225 1726882745.74827: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 22225 1726882745.74858: stdout chunk (state=3): >>>import 'time' # <<< 22225 1726882745.74864: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 22225 1726882745.74935: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882745.74969: stdout chunk (state=3): >>>import '_codecs' # <<< 22225 1726882745.74973: stdout chunk (state=3): >>>import 'codecs' # <<< 22225 1726882745.75112: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70050a8530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7005077b30> <<< 22225 1726882745.75116: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 22225 1726882745.75118: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70050aaab0> <<< 22225 1726882745.75121: stdout chunk (state=3): >>>import '_signal' # <<< 22225 1726882745.75150: stdout chunk (state=3): >>>import '_abc' # <<< 22225 1726882745.75208: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 22225 1726882745.75329: stdout chunk (state=3): >>>import '_collections_abc' # <<< 22225 1726882745.75332: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 22225 1726882745.75426: stdout chunk (state=3): >>>import 'os' # <<< 22225 1726882745.75430: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 22225 1726882745.75480: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 22225 1726882745.75496: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e59190> <<< 22225 1726882745.75555: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 22225 1726882745.75580: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e59fd0> <<< 22225 1726882745.75702: stdout chunk (state=3): >>>import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 22225 1726882745.76089: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 22225 1726882745.76093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 22225 1726882745.76096: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 22225 1726882745.76098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882745.76113: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 22225 1726882745.76198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 22225 1726882745.76202: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22225 1726882745.76204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 22225 1726882745.76308: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e97da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 22225 1726882745.76312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 22225 1726882745.76315: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e97fe0> <<< 22225 1726882745.76317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 22225 1726882745.76344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 22225 1726882745.76360: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 22225 1726882745.76415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882745.76434: stdout chunk (state=3): >>>import 'itertools' # <<< 22225 1726882745.76533: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from 
'/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ecf7d0> <<< 22225 1726882745.76536: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 22225 1726882745.76538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ecfe60> <<< 22225 1726882745.76540: stdout chunk (state=3): >>>import '_collections' # <<< 22225 1726882745.76581: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004eafa70> <<< 22225 1726882745.76594: stdout chunk (state=3): >>>import '_functools' # <<< 22225 1726882745.76620: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ead190> <<< 22225 1726882745.76830: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e94f50> <<< 22225 1726882745.76836: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 22225 1726882745.76841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 22225 1726882745.76844: stdout chunk (state=3): >>>import '_sre' # <<< 22225 1726882745.76857: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22225 1726882745.76904: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ef3770> <<< 22225 1726882745.76927: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ef2390> <<< 22225 1726882745.77133: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004eae060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ef0bf0> <<< 22225 1726882745.77137: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f20800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e941d0> <<< 22225 1726882745.77140: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 22225 1726882745.77148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f20cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f20b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882745.77155: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f20f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e92cf0> <<< 22225 1726882745.77189: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882745.77209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 22225 1726882745.77352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 22225 1726882745.77356: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f21610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f212e0> <<< 22225 1726882745.77359: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f22510> import 'importlib.util' # import 'runpy' # <<< 22225 1726882745.77377: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 22225 1726882745.77416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 22225 1726882745.77441: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 22225 1726882745.77460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f3c710> <<< 22225 1726882745.77589: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f3de50> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 22225 1726882745.77593: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 22225 1726882745.77595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7004f3ecf0> <<< 22225 1726882745.77641: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f3f350> <<< 22225 1726882745.77664: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f3e240> <<< 22225 1726882745.77736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 22225 1726882745.77745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 22225 1726882745.77820: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f3fda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f3f4d0> <<< 22225 1726882745.77825: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f22570> <<< 22225 1726882745.77828: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 22225 1726882745.77989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004c77d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004ca0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ca04a0> <<< 22225 1726882745.78006: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004ca06b0> <<< 22225 1726882745.78062: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004ca0920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004c75eb0> <<< 22225 1726882745.78080: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 22225 1726882745.78372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ca1ee0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ca0b60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f22c30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 22225 1726882745.78437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 22225 1726882745.78478: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004cce2a0> <<< 22225 1726882745.78505: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 22225 1726882745.78557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882745.78562: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 22225 1726882745.78569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 22225 1726882745.78666: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ce63f0> <<< 22225 1726882745.78684: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22225 1726882745.78794: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004d23170> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 22225 1726882745.78990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004d45910> <<< 22225 1726882745.79064: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f7004d23290> <<< 22225 1726882745.79134: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ce7050> <<< 22225 1726882745.79147: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 22225 1726882745.79571: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b242c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ce5430> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ca2e40> <<< 22225 1726882745.79575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7004ce5550> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_mpz9hy8p/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 22225 1726882745.79700: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.79729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 22225 1726882745.79748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 22225 1726882745.79856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22225 1726882745.79860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 22225 1726882745.79895: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b89fa0> <<< 22225 1726882745.79907: stdout chunk (state=3): >>>import '_typing' # <<< 22225 1726882745.80315: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b60e90> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b27fe0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 22225 1726882745.81799: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.83117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b63e30> <<< 22225 1726882745.83154: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 22225 1726882745.83177: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 22225 1726882745.83298: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004bb9a60> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004bb97f0> <<< 22225 1726882745.83374: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004bb9100> <<< 22225 1726882745.83377: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 22225 1726882745.83384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 22225 1726882745.83626: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004bb95e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b8ac30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004bba7e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004bbaa20> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004bbaf60> import 'pwd' # <<< 22225 1726882745.83650: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 22225 1726882745.83846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a20d10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a22930> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a23290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 22225 1726882745.83921: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 22225 1726882745.83925: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a24470> <<< 22225 1726882745.83929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 22225 1726882745.84029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 22225 1726882745.84033: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22225 1726882745.84083: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a26f30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a27290> <<< 22225 1726882745.84102: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a251f0> <<< 22225 1726882745.84144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22225 1726882745.84408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 22225 1726882745.84425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a2af90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a29a60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a297c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 22225 1726882745.84467: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a2bf50> <<< 22225 1726882745.84499: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a25700> <<< 22225 1726882745.84531: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a6f140> <<< 22225 1726882745.84554: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 22225 1726882745.84568: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a6f320> <<< 22225 1726882745.84589: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 22225 1726882745.84626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 22225 1726882745.84634: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 22225 1726882745.84641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 22225 1726882745.84875: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a70e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a70bc0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a73320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a714f0> <<< 22225 1726882745.84917: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 22225 1726882745.85011: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882745.85015: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 22225 1726882745.85017: stdout chunk (state=3): >>>import '_string' # <<< 22225 1726882745.85066: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a7ea50> <<< 22225 1726882745.85289: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a73410> <<< 22225 1726882745.85305: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a7fcb0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a7f9e0> <<< 22225 1726882745.85350: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a7fc80> <<< 22225 1726882745.85363: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a6f530> <<< 22225 1726882745.85380: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 22225 1726882745.85449: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 22225 1726882745.85844: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a832c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a847a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a81a60> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a82de0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a816d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 22225 1726882745.85864: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.85979: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 22225 1726882745.86002: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.86018: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 22225 1726882745.86099: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.86168: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.86339: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.86931: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 22225 1726882745.87660: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f700490c8f0> <<< 22225 1726882745.87987: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700490d670> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a871a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 22225 1726882745.88314: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.88317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 22225 1726882745.88320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700490d6a0> <<< 22225 1726882745.88739: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.88796: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.89631: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.89750: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.89871: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 22225 1726882745.89889: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.89943: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.89997: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 22225 1726882745.90008: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.90141: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.90277: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 22225 1726882745.90301: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.90312: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.90348: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 22225 1726882745.90450: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.90486: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 22225 1726882745.90510: stdout chunk (state=3): >>> <<< 22225 1726882745.90529: stdout chunk (state=3): >>># zipimport: zlib available<<< 22225 1726882745.90739: stdout chunk (state=3): >>> <<< 22225 1726882745.90957: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 22225 1726882745.91380: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22225 1726882745.91464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 22225 1726882745.91529: stdout chunk (state=3): >>>import '_ast' # <<< 22225 1726882745.91608: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700490e300> <<< 22225 1726882745.91612: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.92064: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.92330: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 22225 1726882745.92576: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004916300> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004916c00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700490f3e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882745.92693: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882745.92855: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004915850> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004916d20> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882745.92862: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.92939: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 22225 1726882745.92950: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 22225 1726882745.92990: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 22225 1726882745.92994: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 22225 1726882745.93088: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 22225 1726882745.93092: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 22225 1726882745.93094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 22225 1726882745.93187: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049aaf60> <<< 22225 1726882745.93225: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004923e00> <<< 22225 1726882745.93449: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700491adb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700491ac00> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 22225 1726882745.93453: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.93455: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 22225 1726882745.93458: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 22225 1726882745.93557: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.93561: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 22225 1726882745.93563: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.93566: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.93827: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882745.93831: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.93834: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.93836: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 22225 1726882745.93853: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.94239: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 22225 1726882745.94271: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.94653: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 22225 1726882745.94665: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049b1e20> <<< 22225 1726882745.94696: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 22225 1726882745.94701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 22225 1726882745.94741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 22225 1726882745.94947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f14620> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003f149e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004991670> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004990a70> <<< 22225 1726882745.94962: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049b0500> <<< 22225 1726882745.94979: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049b08f0> <<< 22225 1726882745.94990: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 22225 1726882745.95151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003f178c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f17170> <<< 22225 1726882745.95170: stdout chunk (state=3): >>># extension module '_queue' loaded from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003f17350> <<< 22225 1726882745.95335: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f165d0> <<< 22225 1726882745.95338: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 22225 1726882745.95474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f179e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003f7e510> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f7c530> <<< 22225 1726882745.95478: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049b15e0> import 'ansible.module_utils.facts.timeout' # <<< 22225 1726882745.95582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 22225 1726882745.95586: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22225 1726882745.95588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 22225 1726882745.95591: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.95829: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 22225 1726882745.95876: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.95907: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 22225 1726882745.96166: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882745.96268: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882745.96359: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.96450: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.96543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 22225 1726882745.96562: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.system.cmdline' # <<< 22225 1726882745.96744: stdout chunk (state=3): >>> # 
zipimport: zlib available <<< 22225 1726882745.97726: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.98457: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 22225 1726882745.98468: stdout chunk (state=3): >>># zipimport: zlib available<<< 22225 1726882745.98480: stdout chunk (state=3): >>> <<< 22225 1726882745.98575: stdout chunk (state=3): >>># zipimport: zlib available<<< 22225 1726882745.98589: stdout chunk (state=3): >>> <<< 22225 1726882745.98682: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.98829: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 22225 1726882745.98943: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882745.98999: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 22225 1726882745.99015: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.99047: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.99071: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 22225 1726882745.99150: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 22225 1726882745.99366: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.99369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 22225 1726882745.99371: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 22225 1726882745.99544: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f7fbc0> <<< 22225 1726882745.99563: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f7f2f0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 22225 1726882745.99636: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.99703: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 22225 1726882745.99714: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.99904: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882745.99908: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 22225 1726882746.00011: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.00014: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.00072: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 22225 1726882746.00082: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.00118: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.00433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from 
'/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882746.00437: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003fae630> <<< 22225 1726882746.00579: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f9a360> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 22225 1726882746.00647: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.00749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 22225 1726882746.00752: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.00808: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.00910: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.01437: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882746.01509: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 22225 1726882746.01524: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.01691: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 22225 1726882746.01704: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882746.01844: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003dc6570> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003dc6180> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 22225 1726882746.01925: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.01930: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 22225 1726882746.01943: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.02210: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.02488: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 22225 1726882746.02507: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.02658: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.02847: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.02903: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.02993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 22225 1726882746.03039: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 22225 1726882746.03061: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.03329: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.03552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 22225 1726882746.03584: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.03836: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.04036: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 22225 1726882746.04040: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.04137: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22225 1726882746.05213: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.06150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 22225 1726882746.06178: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 22225 1726882746.06200: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.06366: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.06543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 22225 1726882746.06628: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.06733: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.06916: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 22225 1726882746.06933: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.07219: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.07480: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 22225 1726882746.07504: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.07515: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 22225 1726882746.07802: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 22225 1726882746.07920: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.08014: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.08460: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.08798: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 22225 1726882746.08935: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.08938: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 22225 1726882746.09032: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 22225 1726882746.09150: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.09276: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 22225 1726882746.09295: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.09330: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.freebsd' # <<< 22225 1726882746.09359: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.09462: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.09514: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 22225 1726882746.09620: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22225 1726882746.09702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 22225 1726882746.09726: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.10230: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.10718: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 22225 1726882746.10730: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.10828: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.10924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 22225 1726882746.10940: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.10999: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.11039: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 22225 1726882746.11054: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.11105: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.11220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882746.11278: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 22225 1726882746.11579: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 22225 1726882746.11629: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 22225 1726882746.11845: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882746.11873: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.11955: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.12138: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.12228: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 22225 1726882746.12232: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 22225 1726882746.12235: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 22225 1726882746.12237: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.12326: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.12406: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 22225 1726882746.12421: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.12929: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.13142: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 22225 1726882746.13159: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 22225 1726882746.13224: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.13296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 22225 1726882746.13311: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.13429: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.13518: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 22225 1726882746.13845: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 22225 1726882746.13916: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.14076: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 22225 1726882746.14194: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882746.14726: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003deeed0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003ded9a0> <<< 22225 1726882746.14771: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003deef90> <<< 22225 1726882747.85353: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003e35910> <<< 22225 1726882747.85398: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 22225 1726882747.85437: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003e36660> <<< 22225 1726882747.85503: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882747.85572: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003fa4dd0> <<< 22225 1726882747.85638: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003fa48c0> <<< 22225 1726882747.85932: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 22225 1726882747.86146: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 22225 1726882748.06558: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_distribution": 
"Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_loadavg": {"1m": 0.75146484375, "5m": 0.6259765625, "15m": 0.375}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/u<<< 22225 1726882748.06577: stdout chunk (state=3): >>>sr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "07", "epoch": "1726882747", "epoch_int": "1726882747", "date": "2024-09-20", "time": "21:39:07", "iso8601_micro": "2024-09-21T01:39:07.652832Z", "iso8601": "2024-09-21T01:39:07Z", "iso8601_basic": "20240920T213907652832", "iso8601_basic_short": "20240920T213907", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", 
"large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"<<< 22225 1726882748.06598: stdout chunk (state=3): >>>}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3043, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 673, "free": 3043}, "nocache": {"free": 3456, "used": 260}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, 
"model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 706, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251373637632, "block_size": 4096, "block_total": 64483404, "block_available": 61370517, "block_used": 3112887, "inode_total": 16384000, "inode_available": 16303047, "inode_used": 80953, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22225 1726882748.07261: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 22225 1726882748.07298: stdout chunk (state=3): >>> <<< 22225 1726882748.07302: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ <<< 22225 1726882748.07331: stdout chunk (state=3): >>># clear sys.path<<< 22225 1726882748.07337: stdout chunk (state=3): >>> # clear sys.argv<<< 22225 1726882748.07383: stdout chunk (state=3): >>> # clear sys.ps1<<< 22225 1726882748.07387: stdout chunk (state=3): >>> # clear sys.ps2 # clear sys.last_exc <<< 22225 1726882748.07389: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value<<< 22225 1726882748.07395: stdout chunk (state=3): >>> <<< 22225 1726882748.07431: stdout chunk (state=3): >>># clear sys.last_traceback <<< 22225 1726882748.07452: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin<<< 22225 1726882748.07455: stdout chunk (state=3): >>> # restore sys.stdout <<< 22225 1726882748.07461: stdout chunk (state=3): >>># restore sys.stderr<<< 22225 1726882748.07697: stdout chunk (state=3): >>> # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing 
reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanu<<< 22225 1726882748.07708: stdout chunk (state=3): >>>p[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # 
cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing 
heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix<<< 22225 1726882748.07712: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin<<< 22225 1726882748.07717: stdout chunk (state=3): >>> <<< 22225 1726882748.07783: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux <<< 22225 1726882748.07786: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd<<< 22225 1726882748.07796: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd<<< 22225 1726882748.07800: stdout chunk (state=3): >>> <<< 22225 1726882748.07823: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.sunos <<< 22225 1726882748.07843: stdout chunk (state=3): >>># cleanup[2] removing 
ansible.module_utils.facts.network <<< 22225 1726882748.07861: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly<<< 22225 1726882748.07868: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi<<< 22225 1726882748.07901: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors<<< 22225 1726882748.07905: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other <<< 22225 1726882748.07928: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips<<< 22225 1726882748.07956: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr<<< 22225 1726882748.07969: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux<<< 22225 1726882748.08000: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd<<< 22225 1726882748.08006: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi<<< 22225 1726882748.08037: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl<<< 22225 1726882748.08044: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata<<< 22225 1726882748.08126: stdout chunk (state=3): >>> # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 22225 1726882748.08596: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 22225 1726882748.08618: stdout chunk (state=3): >>> <<< 22225 1726882748.08637: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc<<< 22225 1726882748.08644: stdout chunk (state=3): >>> # destroy importlib.util <<< 22225 1726882748.08685: stdout chunk (state=3): >>># destroy _bz2<<< 22225 1726882748.08689: stdout chunk (state=3): >>> # destroy _compression <<< 22225 1726882748.08719: stdout chunk (state=3): >>># destroy _lzma # destroy binascii<<< 22225 1726882748.08728: stdout chunk (state=3): >>> # destroy zlib<<< 22225 1726882748.08745: stdout chunk (state=3): >>> # destroy bz2 # destroy lzma # destroy zipfile._path<<< 22225 1726882748.08771: stdout chunk (state=3): >>> # destroy zipfile <<< 22225 1726882748.08785: stdout chunk (state=3): >>># destroy pathlib <<< 22225 1726882748.08788: stdout chunk (state=3): >>># 
destroy zipfile._path.glob # destroy ipaddress<<< 22225 1726882748.08824: stdout chunk (state=3): >>> <<< 22225 1726882748.08852: stdout chunk (state=3): >>># destroy ntpath<<< 22225 1726882748.08872: stdout chunk (state=3): >>> <<< 22225 1726882748.08875: stdout chunk (state=3): >>># destroy importlib<<< 22225 1726882748.08893: stdout chunk (state=3): >>> <<< 22225 1726882748.08901: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib<<< 22225 1726882748.08916: stdout chunk (state=3): >>> # destroy json.decoder<<< 22225 1726882748.08937: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner<<< 22225 1726882748.08941: stdout chunk (state=3): >>> # destroy _json<<< 22225 1726882748.08973: stdout chunk (state=3): >>> # destroy grp<<< 22225 1726882748.08977: stdout chunk (state=3): >>> # destroy encodings # destroy _locale <<< 22225 1726882748.08999: stdout chunk (state=3): >>># destroy locale<<< 22225 1726882748.09025: stdout chunk (state=3): >>> # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog <<< 22225 1726882748.09110: stdout chunk (state=3): >>># destroy uuid # destroy _hashlib # destroy _blake2 <<< 22225 1726882748.09128: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 22225 1726882748.09176: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging<<< 22225 1726882748.09284: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 22225 1726882748.09287: stdout chunk (state=3): >>> <<< 22225 1726882748.09293: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize<<< 22225 1726882748.09332: stdout chunk (state=3): >>> # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle<<< 22225 1726882748.09359: stdout chunk (state=3): >>> # destroy _compat_pickle # destroy _pickle # destroy queue <<< 22225 1726882748.09394: stdout chunk (state=3): >>># destroy _heapq # destroy _queue # destroy multiprocessing.reduction <<< 22225 1726882748.09462: stdout chunk (state=3): >>># destroy selectors # destroy shlex # destroy fcntl <<< 22225 1726882748.09475: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64<<< 22225 1726882748.09545: stdout chunk (state=3): >>> # destroy _ssl <<< 22225 1726882748.09574: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 22225 1726882748.09597: stdout chunk (state=3): >>># destroy pwd # destroy termios # destroy json <<< 22225 1726882748.09653: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 22225 1726882748.09702: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context <<< 22225 1726882748.09705: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 22225 1726882748.09815: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] 
wiping selinux._selinux <<< 22225 1726882748.09858: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 22225 1726882748.09903: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 22225 1726882748.09933: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize <<< 22225 1726882748.09973: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 22225 1726882748.10015: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random<<< 22225 1726882748.10114: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc<<< 22225 1726882748.10137: stdout chunk (state=3): >>> # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator<<< 22225 1726882748.10176: stdout chunk (state=3): >>> # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc <<< 22225 1726882748.10186: stdout chunk (state=3): >>># cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 22225 1726882748.10190: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external <<< 22225 1726882748.10213: stdout chunk (state=3): >>># cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 22225 1726882748.10244: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 22225 1726882748.10265: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux<<< 22225 1726882748.10268: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader<<< 22225 1726882748.10325: stdout chunk (state=3): >>> # destroy systemd._journal # destroy _datetime <<< 22225 1726882748.10523: stdout chunk (state=3): >>># destroy sys.monitoring<<< 22225 1726882748.10538: stdout chunk (state=3): >>> <<< 22225 1726882748.10547: stdout chunk (state=3): >>># destroy _socket <<< 22225 1726882748.10572: stdout chunk 
(state=3): >>># destroy _collections<<< 22225 1726882748.10581: stdout chunk (state=3): >>> <<< 22225 1726882748.10611: stdout chunk (state=3): >>># destroy platform<<< 22225 1726882748.10629: stdout chunk (state=3): >>> # destroy _uuid<<< 22225 1726882748.10634: stdout chunk (state=3): >>> # destroy stat # destroy genericpath # destroy re._parser<<< 22225 1726882748.10656: stdout chunk (state=3): >>> # destroy tokenize<<< 22225 1726882748.10704: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib<<< 22225 1726882748.10717: stdout chunk (state=3): >>> # destroy copyreg # destroy contextlib<<< 22225 1726882748.10725: stdout chunk (state=3): >>> <<< 22225 1726882748.10774: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize<<< 22225 1726882748.10777: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib_parse<<< 22225 1726882748.10784: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 22225 1726882748.10830: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external <<< 22225 1726882748.10837: stdout chunk (state=3): >>># destroy _imp <<< 22225 1726882748.10894: stdout chunk (state=3): >>># destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules<<< 22225 1726882748.10907: stdout chunk (state=3): >>> # destroy _frozen_importlib<<< 22225 1726882748.10917: stdout chunk (state=3): >>> <<< 22225 1726882748.11037: stdout chunk (state=3): >>># destroy codecs<<< 22225 1726882748.11041: stdout chunk (state=3): >>> # destroy encodings.aliases # destroy encodings.utf_8<<< 22225 1726882748.11075: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io<<< 22225 1726882748.11087: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref # destroy collections<<< 22225 1726882748.11101: stdout chunk (state=3): >>> # destroy threading # destroy atexit # destroy _warnings<<< 22225 1726882748.11120: stdout chunk (state=3): >>> # destroy math # destroy _bisect # destroy time<<< 22225 1726882748.11164: stdout chunk (state=3): >>> # destroy _random<<< 22225 1726882748.11167: stdout chunk (state=3): >>> # destroy _weakref<<< 22225 1726882748.11173: stdout chunk (state=3): >>> <<< 22225 1726882748.11211: stdout chunk (state=3): >>># destroy _operator<<< 22225 1726882748.11226: stdout chunk (state=3): >>> # destroy _sha2 # destroy _sre # destroy _string<<< 22225 1726882748.11241: stdout chunk (state=3): >>> # destroy re # destroy itertools<<< 22225 1726882748.11263: stdout chunk (state=3): >>> <<< 22225 1726882748.11266: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 22225 1726882748.11301: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 22225 1726882748.11865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882748.11927: stderr chunk (state=3): >>><<< 22225 1726882748.11930: stdout chunk (state=3): >>><<< 22225 1726882748.12038: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70050a8530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7005077b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70050aaab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e59190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e59fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e97da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e97fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ecf7d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ecfe60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004eafa70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ead190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e94f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ef3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ef2390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004eae060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ef0bf0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f20800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e941d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f20cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f20b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f20f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004e92cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f21610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f212e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f22510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f3c710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f3de50> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7004f3ecf0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f3f350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f3e240> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004f3fda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f3f4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f22570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004c77d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004ca0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ca04a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004ca06b0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004ca0920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004c75eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ca1ee0> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ca0b60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004f22c30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004cce2a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ce63f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004d23170> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004d45910> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004d23290> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ce7050> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b242c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ce5430> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004ca2e40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7004ce5550> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_mpz9hy8p/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b89fa0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b60e90> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b27fe0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b63e30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004bb9a60> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004bb97f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004bb9100> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004bb95e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004b8ac30> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004bba7e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004bbaa20> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004bbaf60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a20d10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a22930> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a23290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a24470> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a26f30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a27290> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a251f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a2af90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a29a60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a297c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a2bf50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a25700> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a6f140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a6f320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a70e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a70bc0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a73320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a714f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a7ea50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a73410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a7fcb0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a7f9e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a7fc80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a6f530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a832c0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a847a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a81a60> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004a82de0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a816d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f700490c8f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700490d670> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004a871a0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700490d6a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700490e300> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004916300> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004916c00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700490f3e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7004915850> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004916d20> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049aaf60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004923e00> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700491adb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f700491ac00> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049b1e20> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7003f14620> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003f149e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004991670> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7004990a70> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049b0500> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049b08f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003f178c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f17170> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003f17350> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f165d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f179e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003f7e510> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f7c530> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f70049b15e0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f7fbc0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f7f2f0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003fae630> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003f9a360> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003dc6570> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003dc6180> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7003deeed0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003ded9a0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003deef90> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003e35910> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003e36660> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003fa4dd0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7003fa48c0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_loadavg": {"1m": 0.75146484375, "5m": 0.6259765625, "15m": 0.375}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "07", "epoch": "1726882747", "epoch_int": "1726882747", "date": "2024-09-20", "time": "21:39:07", "iso8601_micro": "2024-09-21T01:39:07.652832Z", "iso8601": "2024-09-21T01:39:07Z", "iso8601_basic": "20240920T213907652832", "iso8601_basic_short": "20240920T213907", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": 
"on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3043, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 673, "free": 3043}, "nocache": {"free": 3456, "used": 260}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": 
[]}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 706, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251373637632, "block_size": 4096, "block_total": 64483404, "block_available": 61370517, "block_used": 3112887, "inode_total": 16384000, "inode_available": 16303047, "inode_used": 80953, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing 
importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux 
# cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] 
removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # 
cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # 
destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping 
collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. [WARNING]: Module invocation had junk after the JSON data:
[WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
22225 1726882748.14007: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882748.14047: _low_level_execute_command(): starting 22225 1726882748.14052: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882744.7122989-22250-4465326278732/ > /dev/null 2>&1 && sleep 0' 22225 1726882748.14704: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882748.14715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882748.14730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882748.14812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882748.14816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882748.14818: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882748.14820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882748.14829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882748.14831: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882748.14834: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882748.14837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882748.14839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882748.14893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882748.14921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882748.14926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882748.15018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882748.18169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882748.18173: stdout chunk (state=3): >>><<< 22225 1726882748.18175: stderr chunk (state=3): >>><<< 22225 1726882748.18178: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882748.18277: handler run complete 22225 1726882748.18544: variable 'ansible_facts' from source: unknown 22225 1726882748.18656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882748.19155: variable 'ansible_facts' from source: unknown 22225 1726882748.19269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882748.19441: attempt loop complete, returning result 22225 1726882748.19451: _execute() done 22225 1726882748.19462: dumping result to json 22225 1726882748.19505: done dumping result, returning 22225 1726882748.19518: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-ec05-55b7-0000000000b9] 22225 1726882748.19529: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000b9 22225 1726882748.20409: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000b9 22225 1726882748.20412: WORKER PROCESS EXITING ok: [managed_node1] 22225 1726882748.21401: no more pending results, returning what we have 22225 1726882748.21404: results queue empty 22225 1726882748.21405: checking for any_errors_fatal 22225 1726882748.21407: done checking for any_errors_fatal 22225 1726882748.21407: checking for max_fail_percentage 22225 1726882748.21409: done checking for max_fail_percentage 22225 1726882748.21410: checking to see if all hosts have failed and the running result is not ok 22225 1726882748.21411: done checking to see if all hosts have failed 22225 1726882748.21412: getting the remaining hosts for this loop 22225 1726882748.21413: done getting the remaining hosts for this loop 22225 1726882748.21417: getting the next task for host managed_node1 22225 1726882748.21659: done getting next task for host managed_node1 22225 1726882748.21663: ^ task is: TASK: meta (flush_handlers) 22225 1726882748.21665: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882748.21670: getting variables 22225 1726882748.21672: in VariableManager get_vars() 22225 1726882748.21696: Calling all_inventory to load vars for managed_node1 22225 1726882748.21699: Calling groups_inventory to load vars for managed_node1 22225 1726882748.21703: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882748.21714: Calling all_plugins_play to load vars for managed_node1 22225 1726882748.21717: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882748.21720: Calling groups_plugins_play to load vars for managed_node1 22225 1726882748.22452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882748.23141: done with get_vars() 22225 1726882748.23156: done getting variables 22225 1726882748.23229: in VariableManager get_vars() 22225 1726882748.23324: Calling all_inventory to load vars for managed_node1 22225 1726882748.23327: Calling groups_inventory to load vars for managed_node1 22225 1726882748.23330: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882748.23336: Calling all_plugins_play to load vars for managed_node1 22225 1726882748.23339: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882748.23342: Calling groups_plugins_play to load vars for managed_node1 22225 1726882748.23664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882748.24467: done with get_vars() 22225 1726882748.24486: done queuing things up, now waiting for results queue to drain 22225 1726882748.24488: results queue empty 22225 1726882748.24489: checking for any_errors_fatal 22225 1726882748.24492: done checking for any_errors_fatal 22225 1726882748.24493: checking for max_fail_percentage 22225 1726882748.24494: done checking for max_fail_percentage 22225 1726882748.24495: checking to see if all hosts have failed and the running result is not ok 22225 1726882748.24495: done checking to see if all hosts have failed 22225 1726882748.24496: getting the remaining hosts for this loop 22225 1726882748.24497: done getting the remaining hosts for this loop 22225 1726882748.24500: getting the next task for host managed_node1 22225 1726882748.24505: done getting next task for host managed_node1 22225 1726882748.24508: ^ task is: TASK: Include the task 'el_repo_setup.yml' 22225 1726882748.24510: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882748.24512: getting variables 22225 1726882748.24513: in VariableManager get_vars() 22225 1726882748.24850: Calling all_inventory to load vars for managed_node1 22225 1726882748.24854: Calling groups_inventory to load vars for managed_node1 22225 1726882748.24856: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882748.24863: Calling all_plugins_play to load vars for managed_node1 22225 1726882748.24865: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882748.24869: Calling groups_plugins_play to load vars for managed_node1 22225 1726882748.25266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882748.26145: done with get_vars() 22225 1726882748.26236: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:11 Friday 20 September 2024 21:39:08 -0400 (0:00:03.633) 0:00:03.655 ****** 22225 1726882748.26629: entering _queue_task() for managed_node1/include_tasks 22225 1726882748.26632: Creating lock for include_tasks 22225 1726882748.27381: worker is 1 (out of 1 available) 22225 1726882748.27647: exiting _queue_task() for managed_node1/include_tasks 22225 1726882748.27656: done queuing things up, now waiting for results queue to drain 22225 1726882748.27658: waiting for pending results... 22225 1726882748.28265: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 22225 1726882748.28634: in run() - task 0affc7ec-ae25-ec05-55b7-000000000006 22225 1726882748.28651: variable 'ansible_search_path' from source: unknown 22225 1726882748.28698: calling self._execute() 22225 1726882748.28830: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882748.28835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882748.28839: variable 'omit' from source: magic vars 22225 1726882748.29635: _execute() done 22225 1726882748.29640: dumping result to json 22225 1726882748.29643: done dumping result, returning 22225 1726882748.29886: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0affc7ec-ae25-ec05-55b7-000000000006] 22225 1726882748.29889: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000006 22225 1726882748.29968: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000006 22225 1726882748.29971: WORKER PROCESS EXITING 22225 1726882748.30037: no more pending results, returning what we have 22225 1726882748.30044: in VariableManager get_vars() 22225 1726882748.30084: Calling all_inventory to load vars for managed_node1 22225 1726882748.30087: Calling groups_inventory to load vars for managed_node1 22225 1726882748.30092: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882748.30108: Calling all_plugins_play to load vars for managed_node1 22225 1726882748.30110: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882748.30114: Calling groups_plugins_play to load vars for managed_node1 22225 1726882748.30812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882748.31247: done with get_vars() 22225 1726882748.31259: variable 'ansible_search_path' from source: unknown 22225 1726882748.31278: we have included files to process 22225 1726882748.31280: generating 
all_blocks data 22225 1726882748.31281: done generating all_blocks data 22225 1726882748.31282: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22225 1726882748.31283: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22225 1726882748.31287: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22225 1726882748.33155: in VariableManager get_vars() 22225 1726882748.33176: done with get_vars() 22225 1726882748.33189: done processing included file 22225 1726882748.33191: iterating over new_blocks loaded from include file 22225 1726882748.33193: in VariableManager get_vars() 22225 1726882748.33203: done with get_vars() 22225 1726882748.33205: filtering new block on tags 22225 1726882748.33223: done filtering new block on tags 22225 1726882748.33429: in VariableManager get_vars() 22225 1726882748.33441: done with get_vars() 22225 1726882748.33442: filtering new block on tags 22225 1726882748.33460: done filtering new block on tags 22225 1726882748.33462: in VariableManager get_vars() 22225 1726882748.33472: done with get_vars() 22225 1726882748.33473: filtering new block on tags 22225 1726882748.33486: done filtering new block on tags 22225 1726882748.33488: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 22225 1726882748.33494: extending task lists for all hosts with included blocks 22225 1726882748.33548: done extending task lists 22225 1726882748.33549: done processing included files 22225 1726882748.33550: results queue empty 22225 1726882748.33551: checking for any_errors_fatal 22225 1726882748.33552: done checking for any_errors_fatal 22225 1726882748.33553: checking for max_fail_percentage 22225 1726882748.33554: done checking for max_fail_percentage 22225 1726882748.33555: checking to see if all hosts have failed and the running result is not ok 22225 1726882748.33556: done checking to see if all hosts have failed 22225 1726882748.33557: getting the remaining hosts for this loop 22225 1726882748.33558: done getting the remaining hosts for this loop 22225 1726882748.33561: getting the next task for host managed_node1 22225 1726882748.33565: done getting next task for host managed_node1 22225 1726882748.33567: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 22225 1726882748.33569: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882748.33572: getting variables 22225 1726882748.33573: in VariableManager get_vars() 22225 1726882748.33582: Calling all_inventory to load vars for managed_node1 22225 1726882748.33584: Calling groups_inventory to load vars for managed_node1 22225 1726882748.33587: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882748.33593: Calling all_plugins_play to load vars for managed_node1 22225 1726882748.33595: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882748.33598: Calling groups_plugins_play to load vars for managed_node1 22225 1726882748.33986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882748.34413: done with get_vars() 22225 1726882748.34626: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:39:08 -0400 (0:00:00.083) 0:00:03.739 ****** 22225 1726882748.34699: entering _queue_task() for managed_node1/setup 22225 1726882748.35470: worker is 1 (out of 1 available) 22225 1726882748.35482: exiting _queue_task() for managed_node1/setup 22225 1726882748.35493: done queuing things up, now waiting for results queue to drain 22225 1726882748.35494: waiting for pending results... 22225 1726882748.36042: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 22225 1726882748.36047: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000ca 22225 1726882748.36050: variable 'ansible_search_path' from source: unknown 22225 1726882748.36275: variable 'ansible_search_path' from source: unknown 22225 1726882748.36279: calling self._execute() 22225 1726882748.36345: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882748.36392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882748.36410: variable 'omit' from source: magic vars 22225 1726882748.37578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882748.42633: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882748.42716: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882748.42770: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882748.42979: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882748.43012: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882748.43216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882748.43257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882748.43339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22225 1726882748.43389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882748.43629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882748.43866: variable 'ansible_facts' from source: unknown 22225 1726882748.44102: variable 'network_test_required_facts' from source: task vars 22225 1726882748.44152: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 22225 1726882748.44165: variable 'omit' from source: magic vars 22225 1726882748.44265: variable 'omit' from source: magic vars 22225 1726882748.44560: variable 'omit' from source: magic vars 22225 1726882748.44728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882748.44732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882748.44735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882748.44737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882748.44740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882748.44742: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882748.44745: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882748.44747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882748.44846: Set connection var ansible_connection to ssh 22225 1726882748.44941: Set connection var ansible_pipelining to False 22225 1726882748.44955: Set connection var ansible_shell_executable to /bin/sh 22225 1726882748.44965: Set connection var ansible_timeout to 10 22225 1726882748.44997: Set connection var ansible_shell_type to sh 22225 1726882748.45007: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882748.45044: variable 'ansible_shell_executable' from source: unknown 22225 1726882748.45108: variable 'ansible_connection' from source: unknown 22225 1726882748.45117: variable 'ansible_module_compression' from source: unknown 22225 1726882748.45127: variable 'ansible_shell_type' from source: unknown 22225 1726882748.45134: variable 'ansible_shell_executable' from source: unknown 22225 1726882748.45141: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882748.45148: variable 'ansible_pipelining' from source: unknown 22225 1726882748.45216: variable 'ansible_timeout' from source: unknown 22225 1726882748.45227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882748.45507: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882748.45554: variable 'omit' from source: magic vars 22225 1726882748.45649: starting attempt loop 22225 
1726882748.45652: running the handler 22225 1726882748.45672: _low_level_execute_command(): starting 22225 1726882748.45684: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882748.47191: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882748.47405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882748.47514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882748.47695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882748.49536: stdout chunk (state=3): >>>/root <<< 22225 1726882748.49796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882748.49838: stderr chunk (state=3): >>><<< 22225 1726882748.49900: stdout chunk (state=3): >>><<< 22225 1726882748.50107: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882748.50118: _low_level_execute_command(): starting 22225 1726882748.50121: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741 `" && echo ansible-tmp-1726882748.5000541-22387-49479861508741="` echo /root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741 `" ) && sleep 0' 22225 1726882748.51370: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882748.51404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882748.51612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882748.51652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882748.51841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882748.51931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882748.54366: stdout chunk (state=3): >>>ansible-tmp-1726882748.5000541-22387-49479861508741=/root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741 <<< 22225 1726882748.54572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882748.54815: stderr chunk (state=3): >>><<< 22225 1726882748.54843: stdout chunk (state=3): >>><<< 22225 1726882748.54864: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882748.5000541-22387-49479861508741=/root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882748.55010: variable 'ansible_module_compression' from source: unknown 22225 1726882748.55014: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22225 1726882748.55071: variable 'ansible_facts' from source: unknown 22225 1726882748.55303: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/AnsiballZ_setup.py 22225 1726882748.55465: Sending initial data 22225 1726882748.55596: Sent initial data (153 bytes) 22225 1726882748.56531: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882748.56550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882748.56624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882748.56642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882748.56712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882748.56720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882748.56724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882748.56787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882748.59212: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882748.59367: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882748.59426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpqw3zyo97 /root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/AnsiballZ_setup.py <<< 22225 1726882748.59431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/AnsiballZ_setup.py" <<< 22225 1726882748.59472: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpqw3zyo97" to remote "/root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/AnsiballZ_setup.py" <<< 22225 1726882748.60656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882748.60826: stderr chunk (state=3): >>><<< 22225 1726882748.60830: stdout chunk (state=3): >>><<< 22225 1726882748.60832: done transferring module to remote 22225 1726882748.60834: _low_level_execute_command(): starting 22225 1726882748.60837: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/ /root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/AnsiballZ_setup.py && sleep 0' 22225 1726882748.61427: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882748.61468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882748.61602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882748.61640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882748.61809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882748.64386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882748.64464: stderr chunk (state=3): >>><<< 22225 1726882748.64468: stdout chunk (state=3): >>><<< 22225 1726882748.64482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882748.64487: _low_level_execute_command(): starting 22225 1726882748.64493: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/AnsiballZ_setup.py && sleep 0' 22225 1726882748.65286: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882748.65302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882748.65396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882748.65418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882748.65454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882748.65611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882748.69011: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 22225 1726882748.69074: stdout chunk (state=3): >>>import _imp # builtin <<< 22225 1726882748.69112: stdout chunk (state=3): >>>import '_thread' # <<< 22225 1726882748.69138: stdout chunk (state=3): >>>import '_warnings' # <<< 22225 1726882748.69150: stdout chunk (state=3): >>>import '_weakref' # <<< 22225 1726882748.69249: stdout chunk (state=3): >>>import '_io' # <<< 22225 1726882748.69278: stdout chunk (state=3): >>>import 'marshal' # <<< 22225 1726882748.69331: stdout chunk (state=3): >>>import 'posix' # <<< 22225 1726882748.69429: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 22225 1726882748.69494: stdout chunk (state=3): >>># installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 22225 1726882748.69706: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import 
'_codecs' # import 'codecs' # <<< 22225 1726882748.69752: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 22225 1726882748.69834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ec0530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51e8fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ec2ab0> import '_signal' # <<< 22225 1726882748.69873: stdout chunk (state=3): >>>import '_abc' # <<< 22225 1726882748.69902: stdout chunk (state=3): >>>import 'abc' # <<< 22225 1726882748.69967: stdout chunk (state=3): >>>import 'io' # import '_stat' # <<< 22225 1726882748.70010: stdout chunk (state=3): >>>import 'stat' # <<< 22225 1726882748.70135: stdout chunk (state=3): >>>import '_collections_abc' # <<< 22225 1726882748.70176: stdout chunk (state=3): >>>import 'genericpath' # <<< 22225 1726882748.70229: stdout chunk (state=3): >>>import 'posixpath' # <<< 22225 1726882748.70257: stdout chunk (state=3): >>>import 'os' # <<< 22225 1726882748.70290: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 22225 1726882748.70426: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51c71190> <<< 22225 1726882748.70481: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 22225 1726882748.70504: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882748.70529: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51c71fd0> <<< 22225 1726882748.70574: stdout chunk (state=3): >>>import 'site' # <<< 22225 1726882748.70622: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 22225 1726882748.71344: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 22225 1726882748.71365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 22225 1726882748.71403: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 22225 1726882748.71424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882748.71464: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 22225 1726882748.71525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 22225 1726882748.71558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22225 1726882748.71619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cafe60> <<< 22225 1726882748.71644: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 22225 1726882748.71719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51caff20> <<< 22225 1726882748.71741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 22225 1726882748.71785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 22225 1726882748.71828: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 22225 1726882748.71903: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882748.72130: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ce7890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ce7f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cc7b30> import '_functools' # <<< 22225 1726882748.72146: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cc5250> <<< 22225 1726882748.72304: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cad010> <<< 22225 1726882748.72346: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 22225 1726882748.72384: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 22225 1726882748.72414: stdout chunk (state=3): >>>import '_sre' # <<< 22225 1726882748.72445: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 22225 1726882748.72486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 22225 1726882748.72524: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 22225 1726882748.72536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22225 1726882748.72591: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d0b800> <<< 22225 1726882748.72619: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d0a420> <<< 22225 1726882748.72661: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 22225 1726882748.72666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 22225 1726882748.72689: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cc6120> <<< 22225 1726882748.72697: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d08c50> <<< 22225 1726882748.72779: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 22225 1726882748.72797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 22225 1726882748.72812: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d38830> <<< 22225 1726882748.72826: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cac290> <<< 22225 1726882748.72926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d38ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d38b90> <<< 22225 1726882748.72946: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.72971: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.72978: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d38f80> <<< 22225 1726882748.73002: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51caadb0> <<< 22225 1726882748.73044: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 22225 1726882748.73060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882748.73126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 22225 1726882748.73143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 22225 1726882748.73225: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d39670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d39340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 22225 1726882748.73248: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d3a570> <<< 22225 1726882748.73268: stdout chunk (state=3): >>>import 'importlib.util' # <<< 22225 1726882748.73297: stdout chunk (state=3): >>>import 'runpy' # <<< 22225 1726882748.73336: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 22225 1726882748.73393: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 22225 1726882748.73525: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d547a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d55eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 22225 1726882748.73628: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d56d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d573b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d562a0> <<< 22225 1726882748.73658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 22225 1726882748.73682: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 22225 1726882748.73742: stdout chunk (state=3): >>># 
extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.73762: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d57e30> <<< 22225 1726882748.73794: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d57560> <<< 22225 1726882748.73858: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d3a5d0> <<< 22225 1726882748.73899: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 22225 1726882748.74127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51a57cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51a80710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a80470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51a80650> <<< 22225 1726882748.74152: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.74163: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.74174: stdout chunk (state=3): >>>import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51a808c0> <<< 22225 1726882748.74201: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a55e50> <<< 22225 1726882748.74238: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 22225 1726882748.74392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 22225 1726882748.74432: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 22225 1726882748.74446: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 22225 1726882748.74470: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a81f40> <<< 22225 1726882748.74509: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a80bf0> <<< 22225 1726882748.74545: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d3acc0> <<< 22225 1726882748.74588: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 22225 1726882748.74670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882748.74706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 22225 1726882748.74773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 22225 1726882748.74824: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51aae300> <<< 22225 1726882748.74893: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 22225 1726882748.74932: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882748.74968: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 22225 1726882748.75020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 22225 1726882748.75229: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ac6480> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22225 1726882748.75261: stdout chunk (state=3): >>>import 'ntpath' # <<< 22225 1726882748.75305: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 22225 1726882748.75320: stdout chunk (state=3): >>> <<< 22225 1726882748.75325: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51b03260> <<< 22225 1726882748.75360: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 22225 1726882748.75422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 22225 1726882748.75456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 22225 1726882748.75530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22225 1726882748.75674: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51b25a00> <<< 22225 1726882748.75800: stdout chunk 
(state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51b03380> <<< 22225 1726882748.75869: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ac7110> <<< 22225 1726882748.75915: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 22225 1726882748.75919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 22225 1726882748.75941: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51944410> <<< 22225 1726882748.76020: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ac54c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a82ea0> <<< 22225 1726882748.76258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 22225 1726882748.76293: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5f51944680> <<< 22225 1726882748.76597: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_56kyque2/ansible_setup_payload.zip' <<< 22225 1726882748.76603: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.76925: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 22225 1726882748.76979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22225 1726882748.77106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 22225 1726882748.77148: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 22225 1726882748.77157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 22225 1726882748.77169: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519b20f0> <<< 22225 1726882748.77225: stdout chunk (state=3): >>>import '_typing' # <<< 22225 1726882748.77549: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51988fe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51988170> # zipimport: zlib available import 'ansible' # <<< 22225 1726882748.77559: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.77632: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 22225 1726882748.77641: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.80238: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.82086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5f5198bf80> <<< 22225 1726882748.82147: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 22225 1726882748.82196: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 22225 1726882748.82201: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f519ddb50> <<< 22225 1726882748.82248: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519dd8e0> <<< 22225 1726882748.82281: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519dd1f0> <<< 22225 1726882748.82313: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 22225 1726882748.82330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 22225 1726882748.82373: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519dd640> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519b2d80> import 'atexit' # <<< 22225 1726882748.82397: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f519de8d0> <<< 22225 1726882748.82425: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.82466: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f519deb10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 22225 1726882748.82512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 22225 1726882748.82525: stdout chunk (state=3): >>>import '_locale' # <<< 22225 1726882748.82594: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519defc0> import 'pwd' # <<< 22225 1726882748.82611: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 22225 1726882748.82633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' 
<<< 22225 1726882748.82681: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51848da0> <<< 22225 1726882748.82736: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5184a9c0> <<< 22225 1726882748.82739: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 22225 1726882748.82759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 22225 1726882748.82789: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184b2c0> <<< 22225 1726882748.82828: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 22225 1726882748.82874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 22225 1726882748.82879: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184c4a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 22225 1726882748.82948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22225 1726882748.83005: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184ef60> <<< 22225 1726882748.83058: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5184f050> <<< 22225 1726882748.83083: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184d220> <<< 22225 1726882748.83096: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22225 1726882748.83140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 22225 1726882748.83173: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 22225 1726882748.83187: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 22225 1726882748.83206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 22225 1726882748.83258: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 22225 1726882748.83263: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51852db0> <<< 22225 1726882748.83285: stdout chunk (state=3): >>>import '_tokenize' # <<< 22225 1726882748.83346: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518518b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51851610> <<< 22225 1726882748.83367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 22225 1726882748.83450: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51853ce0> <<< 22225 1726882748.83499: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184d730> <<< 22225 1726882748.83519: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51896ed0> <<< 22225 1726882748.83548: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518970e0> <<< 22225 1726882748.83568: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 22225 1726882748.83598: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 22225 1726882748.83627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 22225 1726882748.83680: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51898bf0> <<< 22225 1726882748.83689: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518989b0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 22225 1726882748.83812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 22225 1726882748.83869: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5189b170> 
import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518992e0> <<< 22225 1726882748.83897: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 22225 1726882748.83974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882748.83987: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 22225 1726882748.84039: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518a6960> <<< 22225 1726882748.84178: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5189b2f0> <<< 22225 1726882748.84252: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518a7770> <<< 22225 1726882748.84295: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.84330: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518a79e0> <<< 22225 1726882748.84365: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518a7380> <<< 22225 1726882748.84369: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518972f0> <<< 22225 1726882748.84403: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 22225 1726882748.84406: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 22225 1726882748.84445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 22225 1726882748.84461: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.84486: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518ab470> <<< 22225 1726882748.84675: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518ac860> <<< 22225 1726882748.84725: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518a9be0> <<< 22225 1726882748.84728: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518aaf90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518a97f0> <<< 22225 1726882748.84766: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 22225 1726882748.84791: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.84877: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.84999: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.85005: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.85033: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 22225 1726882748.85051: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.85189: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.85309: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.86111: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.87160: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 22225 1726882748.87169: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 22225 1726882748.87199: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 22225 1726882748.87257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882748.87297: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.87302: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f517349b0> <<< 22225 1726882748.87435: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 22225 1726882748.87444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 22225 1726882748.87464: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51735760> <<< 22225 1726882748.87480: stdout chunk (state=3): >>>import 'ctypes' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5f518af2f0> <<< 22225 1726882748.87542: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 22225 1726882748.87560: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.87603: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 22225 1726882748.87619: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.87876: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.88144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 22225 1726882748.88166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517354c0> <<< 22225 1726882748.88186: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.88811: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.89317: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.89396: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.89636: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 22225 1726882748.89743: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.89908: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 22225 1726882748.89928: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.89947: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 22225 1726882748.90011: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.90062: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 22225 1726882748.90163: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.90515: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.90953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22225 1726882748.91045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 22225 1726882748.91052: stdout chunk (state=3): >>>import '_ast' # <<< 22225 1726882748.91159: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51737ce0> <<< 22225 1726882748.91188: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.91298: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.91418: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 22225 1726882748.91433: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 22225 1726882748.91445: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 22225 1726882748.91473: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 22225 1726882748.91625: stdout chunk 
(state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.91777: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5173e300> <<< 22225 1726882748.91846: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5173ec60> <<< 22225 1726882748.91865: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518acaa0> <<< 22225 1726882748.91923: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.91956: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.92011: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 22225 1726882748.92032: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.92097: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.92204: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.92253: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.92373: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 22225 1726882748.92440: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882748.92881: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5173d8e0> <<< 22225 1726882748.92886: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5173eed0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882748.92893: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 22225 1726882748.92908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 22225 1726882748.92973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 22225 1726882748.92991: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 22225 1726882748.93091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517cef60> <<< 22225 1726882748.93387: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5174bdd0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51746e10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51746c60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 22225 1726882748.93449: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.93630: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882748.93833: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 22225 1726882748.93837: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.93914: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.93934: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.93971: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 22225 1726882748.94075: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.94187: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.94380: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.94545: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 22225 1726882748.94548: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 22225 1726882748.94590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 22225 1726882748.94604: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517d5d60> <<< 22225 1726882748.94627: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 22225 1726882748.94641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 22225 1726882748.94709: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 22225 1726882748.94815: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 22225 1726882748.94819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50d44500> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50d44860> <<< 22225 1726882748.94863: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517b5580> <<< 22225 1726882748.94878: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517b4800> <<< 22225 1726882748.94953: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517d4440> <<< 22225 1726882748.94965: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517d4dd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 22225 1726882748.95033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 22225 1726882748.95047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 22225 1726882748.95144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50d47860> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50d47140> <<< 22225 1726882748.95165: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50d472f0> <<< 22225 1726882748.95184: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50d46570> <<< 22225 1726882748.95202: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 22225 1726882748.95312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 22225 
1726882748.95348: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50d479b0> <<< 22225 1726882748.95363: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 22225 1726882748.95437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50dae4e0> <<< 22225 1726882748.95645: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50dac500> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517d54f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882748.95663: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 22225 1726882748.95707: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.95806: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # <<< 22225 1726882748.95830: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.95846: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 22225 1726882748.95849: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.95938: stdout chunk (state=3): >>># zipimport: zlib available<<< 22225 1726882748.95941: stdout chunk (state=3): >>> <<< 22225 1726882748.95943: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 22225 1726882748.95981: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.96033: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 22225 1726882748.96114: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.96142: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.96145: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 22225 1726882748.96158: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.96210: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.96274: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.96340: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.96402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 22225 1726882748.96449: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.96994: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.97463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 22225 1726882748.97470: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 22225 1726882748.97659: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882748.97663: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 22225 1726882748.97675: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.97701: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.97734: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 22225 1726882748.97746: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.97887: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 22225 1726882748.97919: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.97945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 22225 1726882748.97957: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.97995: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.98026: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 22225 1726882748.98102: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.98246: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50dae7b0> <<< 22225 1726882748.98267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 22225 1726882748.98321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 22225 1726882748.98441: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50daf440> import 'ansible.module_utils.facts.system.local' # <<< 22225 1726882748.98444: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.98517: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.98643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 22225 1726882748.98700: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.98792: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 22225 1726882748.98804: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.98876: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.99046: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882748.99064: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 22225 1726882748.99109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 22225 1726882748.99186: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882748.99262: stdout chunk (state=3): >>># 
extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50de2780> <<< 22225 1726882748.99487: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50dca480> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 22225 1726882748.99546: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.99607: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 22225 1726882748.99619: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.99706: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.99817: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882748.99929: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.00246: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882749.00343: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882749.00368: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50dfe1e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50dffb60> import 'ansible.module_utils.facts.system.user' # <<< 22225 1726882749.00392: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.00395: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.00405: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 22225 1726882749.00474: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22225 1726882749.00537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 22225 1726882749.00688: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.00860: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 22225 1726882749.00863: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.01002: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.01144: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.01148: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.01249: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882749.01256: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.01406: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.01767: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # 
import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882749.01862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 22225 1726882749.01875: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.01900: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.01937: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.02554: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.03200: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 22225 1726882749.03480: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 22225 1726882749.03495: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.03643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 22225 1726882749.03780: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.03961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 22225 1726882749.03969: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.03981: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 22225 1726882749.04010: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.04050: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.04092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 22225 1726882749.04242: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22225 1726882749.04343: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.04558: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.04781: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 22225 1726882749.04795: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.04891: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 22225 1726882749.04996: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 22225 1726882749.05047: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.05117: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 22225 1726882749.05141: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.05212: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 22225 1726882749.05326: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 22225 1726882749.05377: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.05473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 22225 1726882749.05758: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 22225 1726882749.06042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 22225 1726882749.06183: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # <<< 22225 1726882749.06406: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 22225 1726882749.06409: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.06419: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.06482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 22225 1726882749.06619: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.06652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 22225 1726882749.06761: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 22225 1726882749.06765: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.06881: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882749.06908: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.06964: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.07042: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.07210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 22225 1726882749.07309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 22225 1726882749.07485: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.07702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 22225 1726882749.07725: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.07768: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.07811: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 22225 1726882749.07834: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.07878: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.07928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 22225 1726882749.07945: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.08034: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.08131: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 22225 1726882749.08148: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.08244: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.08346: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 22225 1726882749.08349: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 22225 1726882749.08438: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882749.09037: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 22225 1726882749.09050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 22225 1726882749.09098: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50bab8f0> <<< 22225 1726882749.09114: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50ba8650> <<< 22225 1726882749.09197: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50ba83b0> <<< 22225 1726882750.39463: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "09", "epoch": "1726882749", "epoch_int": "1726882749", "date": "2024-09-20", "time": "21:39:09", "iso8601_micro": "2024-09-21T01:39:09.085101Z", "iso8601": "2024-09-21T01:39:09Z", "iso8601_basic": "20240920T213909085101", "iso8601_basic_short": "20240920T213909", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": 
"UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible<<< 22225 1726882750.39509: stdout chunk (state=3): >>>_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22225 1726882750.40091: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value <<< 22225 1726882750.40114: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing 
builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal <<< 22225 1726882750.40129: stdout chunk (state=3): >>># cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs <<< 22225 1726882750.40134: stdout chunk (state=3): >>># cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site <<< 22225 1726882750.40169: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct <<< 22225 1726882750.40177: stdout chunk (state=3): >>># cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib <<< 22225 1726882750.40188: stdout chunk (state=3): >>># cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset <<< 22225 1726882750.40214: stdout chunk (state=3): >>># cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] 
removing pkgutil # destroy pkgutil # cleanup[2] removing ansible <<< 22225 1726882750.40237: stdout chunk (state=3): >>># destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd<<< 22225 1726882750.40248: stdout chunk (state=3): >>> # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex <<< 22225 1726882750.40267: stdout chunk (state=3): >>># cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd<<< 22225 1726882750.40280: stdout chunk (state=3): >>> # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid<<< 22225 1726882750.40289: stdout chunk (state=3): >>> # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader <<< 22225 1726882750.40309: stdout chunk (state=3): >>># cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon <<< 22225 1726882750.40335: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes <<< 22225 1726882750.40341: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 22225 1726882750.40372: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # 
destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 22225 1726882750.40400: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline <<< 22225 1726882750.40432: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # 
cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly <<< 22225 1726882750.40459: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl <<< 22225 1726882750.40477: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts <<< 22225 1726882750.40500: stdout chunk (state=3): >>># destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps <<< 22225 1726882750.40503: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env <<< 22225 1726882750.40530: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base<<< 22225 1726882750.40534: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy 
ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 22225 1726882750.40885: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 22225 1726882750.40891: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 22225 1726882750.40914: stdout chunk (state=3): >>># destroy _bz2 <<< 22225 1726882750.40935: stdout chunk (state=3): >>># destroy _compression # destroy _lzma <<< 22225 1726882750.40938: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 22225 1726882750.40967: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 22225 1726882750.41009: stdout chunk (state=3): >>># destroy ntpath <<< 22225 1726882750.41036: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 22225 1726882750.41052: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 22225 1726882750.41073: stdout chunk (state=3): >>># destroy grp # destroy encodings <<< 22225 1726882750.41083: stdout chunk (state=3): >>># destroy _locale <<< 22225 1726882750.41086: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal <<< 22225 1726882750.41110: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid<<< 22225 1726882750.41116: stdout chunk (state=3): >>> <<< 22225 1726882750.41160: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 22225 1726882750.41180: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 22225 1726882750.41235: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 22225 1726882750.41242: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 22225 1726882750.41264: stdout chunk (state=3): >>># destroy _pickle <<< 22225 1726882750.41287: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 22225 1726882750.41306: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors <<< 22225 1726882750.41313: stdout chunk (state=3): >>># destroy _multiprocessing <<< 22225 1726882750.41342: stdout chunk (state=3): >>># destroy shlex # destroy fcntl <<< 22225 1726882750.41346: stdout chunk (state=3): >>># destroy datetime <<< 22225 1726882750.41363: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 22225 1726882750.41377: stdout chunk (state=3): >>># destroy _ssl <<< 22225 1726882750.41412: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 22225 1726882750.41415: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios <<< 22225 1726882750.41425: stdout chunk (state=3): >>># destroy errno # 
destroy json <<< 22225 1726882750.41456: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 22225 1726882750.41463: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 22225 1726882750.41519: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 22225 1726882750.41546: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 22225 1726882750.41562: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 22225 1726882750.41591: stdout chunk (state=3): >>># cleanup[3] wiping tokenize <<< 22225 1726882750.41598: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 22225 1726882750.41605: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 22225 1726882750.41632: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 22225 1726882750.41654: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 22225 1726882750.41664: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre <<< 22225 1726882750.41701: stdout chunk (state=3): >>># cleanup[3] wiping functools <<< 22225 1726882750.41718: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 22225 1726882750.41726: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 22225 1726882750.41755: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc <<< 22225 1726882750.41769: stdout chunk (state=3): >>># cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 22225 1726882750.41797: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 22225 1726882750.41800: 
stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 22225 1726882750.41803: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 22225 1726882750.41943: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 22225 1726882750.41963: stdout chunk (state=3): >>># destroy _collections <<< 22225 1726882750.41994: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 22225 1726882750.41997: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser <<< 22225 1726882750.42026: stdout chunk (state=3): >>># destroy tokenize <<< 22225 1726882750.42036: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 22225 1726882750.42052: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 22225 1726882750.42076: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 22225 1726882750.42080: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 22225 1726882750.42103: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal<<< 22225 1726882750.42109: stdout chunk (state=3): >>> <<< 22225 1726882750.42136: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 22225 1726882750.42235: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 22225 1726882750.42238: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 22225 1726882750.42253: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 22225 1726882750.42267: stdout chunk (state=3): >>># destroy time <<< 22225 1726882750.42280: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 22225 1726882750.42314: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 <<< 22225 1726882750.42327: stdout chunk (state=3): >>># destroy _sre # destroy _string # destroy re # destroy itertools <<< 22225 1726882750.42363: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 22225 1726882750.42367: stdout chunk (state=3): >>># destroy _thread <<< 22225 1726882750.42370: stdout chunk (state=3): >>># clear sys.audit hooks <<< 22225 1726882750.42754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882750.42820: stderr chunk (state=3): >>><<< 22225 1726882750.42826: stdout chunk (state=3): >>><<< 22225 1726882750.42926: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ec0530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51e8fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ec2ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51c71190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51c71fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cafe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51caff20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ce7890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ce7f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cc7b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cc5250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cad010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d0b800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d0a420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cc6120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d08c50> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d38830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51cac290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d38ce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d38b90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d38f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51caadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d39670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d39340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d3a570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d547a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d55eb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5f51d56d50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d573b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d562a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51d57e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d57560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d3a5d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51a57cb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51a80710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a80470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51a80650> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51a808c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a55e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a81f40> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a80bf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51d3acc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51aae300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ac6480> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51b03260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51b25a00> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51b03380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ac7110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51944410> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51ac54c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51a82ea0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5f51944680> # zipimport: found 103 names in '/tmp/ansible_setup_payload_56kyque2/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519b20f0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51988fe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51988170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5198bf80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f519ddb50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519dd8e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519dd1f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519dd640> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519b2d80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f519de8d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f519deb10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f519defc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51848da0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5184a9c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184b2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184c4a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184ef60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5184f050> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184d220> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51852db0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518518b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51851610> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51853ce0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5184d730> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51896ed0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518970e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f51898bf0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518989b0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5189b170> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518992e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518a6960> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5189b2f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518a7770> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518a79e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518a7380> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518972f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518ab470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518ac860> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518a9be0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f518aaf90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518a97f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f517349b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51735760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518af2f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517354c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51737ce0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5173e300> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5173ec60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f518acaa0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f5173d8e0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5173eed0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517cef60> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f5174bdd0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51746e10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f51746c60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517d5d60> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5f50d44500> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50d44860> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517b5580> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517b4800> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517d4440> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517d4dd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50d47860> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50d47140> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50d472f0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50d46570> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50d479b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50dae4e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50dac500> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f517d54f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50dae7b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50daf440> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50de2780> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50dca480> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50dfe1e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50dffb60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5f50bab8f0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50ba8650> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5f50ba83b0> {"ansible_facts": {"ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "09", "epoch": "1726882749", "epoch_int": "1726882749", "date": "2024-09-20", "time": "21:39:09", "iso8601_micro": "2024-09-21T01:39:09.085101Z", "iso8601": "2024-09-21T01:39:09Z", "iso8601_basic": "20240920T213909085101", "iso8601_basic_short": "20240920T213909", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", 
"policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing 
shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings 
# cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # 
cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping 
collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. [WARNING]: Module invocation had junk after the JSON data:
22225 1726882750.43810: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882750.43814: _low_level_execute_command(): starting 22225 1726882750.43816: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882748.5000541-22387-49479861508741/ > /dev/null 2>&1 && sleep 0' 22225 1726882750.43819: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882750.43823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882750.43827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.43829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882750.43832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally
10.31.15.7 <<< 22225 1726882750.43834: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882750.43836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.43838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882750.43850: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882750.43853: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882750.43856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.43872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882750.43874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.43937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882750.43941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882750.43945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882750.44000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882750.45907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882750.45963: stderr chunk (state=3): >>><<< 22225 1726882750.45966: stdout chunk (state=3): >>><<< 22225 1726882750.45982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882750.45990: handler run complete 22225 1726882750.46032: variable 'ansible_facts' from source: unknown 22225 1726882750.46070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882750.46155: variable 'ansible_facts' from source: unknown 22225 1726882750.46204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882750.46245: attempt loop complete, returning result 22225 1726882750.46248: _execute() done 22225 1726882750.46250: dumping result to json 22225 
1726882750.46261: done dumping result, returning 22225 1726882750.46269: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affc7ec-ae25-ec05-55b7-0000000000ca] 22225 1726882750.46274: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ca 22225 1726882750.46412: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ca 22225 1726882750.46414: WORKER PROCESS EXITING ok: [managed_node1] 22225 1726882750.46533: no more pending results, returning what we have 22225 1726882750.46536: results queue empty 22225 1726882750.46537: checking for any_errors_fatal 22225 1726882750.46538: done checking for any_errors_fatal 22225 1726882750.46539: checking for max_fail_percentage 22225 1726882750.46541: done checking for max_fail_percentage 22225 1726882750.46542: checking to see if all hosts have failed and the running result is not ok 22225 1726882750.46542: done checking to see if all hosts have failed 22225 1726882750.46543: getting the remaining hosts for this loop 22225 1726882750.46545: done getting the remaining hosts for this loop 22225 1726882750.46549: getting the next task for host managed_node1 22225 1726882750.46557: done getting next task for host managed_node1 22225 1726882750.46561: ^ task is: TASK: Check if system is ostree 22225 1726882750.46563: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882750.46566: getting variables 22225 1726882750.46568: in VariableManager get_vars() 22225 1726882750.46594: Calling all_inventory to load vars for managed_node1 22225 1726882750.46597: Calling groups_inventory to load vars for managed_node1 22225 1726882750.46599: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882750.46609: Calling all_plugins_play to load vars for managed_node1 22225 1726882750.46612: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882750.46614: Calling groups_plugins_play to load vars for managed_node1 22225 1726882750.46794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882750.46932: done with get_vars() 22225 1726882750.46940: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:39:10 -0400 (0:00:02.123) 0:00:05.862 ****** 22225 1726882750.47013: entering _queue_task() for managed_node1/stat 22225 1726882750.47225: worker is 1 (out of 1 available) 22225 1726882750.47240: exiting _queue_task() for managed_node1/stat 22225 1726882750.47250: done queuing things up, now waiting for results queue to drain 22225 1726882750.47251: waiting for pending results... 
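Editor's note on the task that starts here: the log above queues "Check if system is ostree" (task path /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17), and the trace below shows it resolving to the stat action guarded by the conditional "not __network_is_ostree is defined". The task file itself is not reproduced in this log, so the YAML below is only a hedged reconstruction of what such a test-helper task typically looks like; the path /run/ostree-booted and the register/set_fact variable names are assumptions for illustration, not values taken from this output:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted          # assumed path; not shown in this excerpt
      register: __ostree_booted_stat      # hypothetical variable name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

The remainder of the trace shows how ansible-core actually executes a task like this over SSH: it creates a remote temporary directory under ~/.ansible/tmp, builds an AnsiballZ payload for the stat module, transfers it with SFTP, marks it executable with chmod u+x, and then invokes it with the remote Python interpreter (here prefixed with PYTHONVERBOSE=1, which is why the captured stdout contains the verbose module import trace that follows).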
22225 1726882750.47408: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 22225 1726882750.47483: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000cc 22225 1726882750.47493: variable 'ansible_search_path' from source: unknown 22225 1726882750.47497: variable 'ansible_search_path' from source: unknown 22225 1726882750.47526: calling self._execute() 22225 1726882750.47582: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882750.47591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882750.47599: variable 'omit' from source: magic vars 22225 1726882750.47963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882750.48153: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882750.48183: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882750.48210: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882750.48257: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882750.48327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882750.48346: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882750.48366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882750.48391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882750.48489: Evaluated conditional (not __network_is_ostree is defined): True 22225 1726882750.48493: variable 'omit' from source: magic vars 22225 1726882750.48521: variable 'omit' from source: magic vars 22225 1726882750.48549: variable 'omit' from source: magic vars 22225 1726882750.48570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882750.48595: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882750.48611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882750.48626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882750.48635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882750.48659: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882750.48662: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882750.48664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882750.48738: Set connection var ansible_connection to ssh 22225 1726882750.48746: Set connection var ansible_pipelining to False 22225 1726882750.48754: Set connection 
var ansible_shell_executable to /bin/sh 22225 1726882750.48759: Set connection var ansible_timeout to 10 22225 1726882750.48762: Set connection var ansible_shell_type to sh 22225 1726882750.48767: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882750.48791: variable 'ansible_shell_executable' from source: unknown 22225 1726882750.48794: variable 'ansible_connection' from source: unknown 22225 1726882750.48797: variable 'ansible_module_compression' from source: unknown 22225 1726882750.48799: variable 'ansible_shell_type' from source: unknown 22225 1726882750.48802: variable 'ansible_shell_executable' from source: unknown 22225 1726882750.48805: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882750.48808: variable 'ansible_pipelining' from source: unknown 22225 1726882750.48810: variable 'ansible_timeout' from source: unknown 22225 1726882750.48812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882750.48918: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882750.48928: variable 'omit' from source: magic vars 22225 1726882750.48940: starting attempt loop 22225 1726882750.48943: running the handler 22225 1726882750.48950: _low_level_execute_command(): starting 22225 1726882750.48956: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882750.49486: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.49490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.49492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.49494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.49553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882750.49556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882750.49558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882750.49623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882750.51319: stdout chunk (state=3): >>>/root <<< 22225 1726882750.51431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882750.51486: stderr chunk (state=3): >>><<< 22225 1726882750.51490: stdout chunk (state=3): >>><<< 22225 1726882750.51512: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882750.51526: _low_level_execute_command(): starting 22225 1726882750.51533: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723 `" && echo ansible-tmp-1726882750.515113-22459-167927602550723="` echo /root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723 `" ) && sleep 0' 22225 1726882750.52010: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.52014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.52016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.52019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.52068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882750.52072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882750.52076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882750.52132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882750.54117: stdout chunk (state=3): >>>ansible-tmp-1726882750.515113-22459-167927602550723=/root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723 <<< 22225 1726882750.54232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882750.54289: stderr chunk (state=3): >>><<< 22225 1726882750.54292: stdout chunk (state=3): >>><<< 22225 1726882750.54304: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882750.515113-22459-167927602550723=/root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882750.54349: variable 'ansible_module_compression' from source: unknown 22225 1726882750.54398: ANSIBALLZ: Using lock for stat 22225 1726882750.54401: ANSIBALLZ: Acquiring lock 22225 1726882750.54404: ANSIBALLZ: Lock acquired: 140272895055664 22225 1726882750.54406: ANSIBALLZ: Creating module 22225 1726882750.62528: ANSIBALLZ: Writing module into payload 22225 1726882750.62597: ANSIBALLZ: Writing module 22225 1726882750.62613: ANSIBALLZ: Renaming module 22225 1726882750.62619: ANSIBALLZ: Done creating module 22225 1726882750.62636: variable 'ansible_facts' from source: unknown 22225 1726882750.62686: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/AnsiballZ_stat.py 22225 1726882750.62786: Sending initial data 22225 1726882750.62790: Sent initial data (152 bytes) 22225 1726882750.63271: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882750.63275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.63277: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882750.63279: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.63285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.63328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882750.63335: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 22225 1726882750.63350: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882750.63405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882750.65141: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 22225 1726882750.65148: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882750.65191: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882750.65243: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp2bijf2qy /root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/AnsiballZ_stat.py <<< 22225 1726882750.65250: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/AnsiballZ_stat.py" <<< 22225 1726882750.65296: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp2bijf2qy" to remote "/root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/AnsiballZ_stat.py" <<< 22225 1726882750.65875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882750.65940: stderr chunk (state=3): >>><<< 22225 1726882750.65943: stdout chunk (state=3): >>><<< 22225 1726882750.65963: done transferring module to remote 22225 1726882750.65977: _low_level_execute_command(): starting 22225 1726882750.65980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/ /root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/AnsiballZ_stat.py && sleep 0' 22225 1726882750.66437: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882750.66441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882750.66444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882750.66446: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.66448: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.66498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882750.66503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882750.66556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882750.68399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882750.68446: stderr chunk (state=3): >>><<< 22225 1726882750.68449: stdout chunk (state=3): >>><<< 22225 1726882750.68462: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882750.68465: _low_level_execute_command(): starting 22225 1726882750.68473: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/AnsiballZ_stat.py && sleep 0' 22225 1726882750.68904: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.68908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.68910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882750.68912: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882750.68915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.68966: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882750.68974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882750.69027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882750.71332: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 22225 1726882750.71359: stdout chunk (state=3): >>>import _imp # builtin <<< 22225 1726882750.71401: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 22225 1726882750.71403: stdout chunk (state=3): >>>import '_weakref' # <<< 22225 1726882750.71469: stdout chunk (state=3): >>>import '_io' # <<< 22225 1726882750.71475: stdout chunk (state=3): >>>import 'marshal' # <<< 22225 1726882750.71508: stdout chunk (state=3): >>>import 'posix' # <<< 22225 1726882750.71547: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 22225 1726882750.71570: stdout chunk (state=3): >>>import 'time' # <<< 22225 1726882750.71575: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 22225 1726882750.71633: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.71656: stdout chunk (state=3): >>>import '_codecs' # <<< 22225 1726882750.71673: stdout chunk (state=3): >>>import 'codecs' # <<< 22225 1726882750.71718: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 22225 1726882750.71735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 22225 1726882750.71750: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba2c0530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba28fb30> <<< 22225 1726882750.71788: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 22225 1726882750.71793: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba2c2ab0> <<< 22225 1726882750.71809: stdout chunk (state=3): >>>import '_signal' # <<< 22225 1726882750.71840: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 22225 1726882750.71863: stdout chunk (state=3): >>>import 'io' # <<< 22225 1726882750.71897: stdout chunk (state=3): >>>import '_stat' # <<< 22225 1726882750.71903: stdout chunk (state=3): >>>import 'stat' # <<< 22225 1726882750.72004: stdout chunk (state=3): >>>import '_collections_abc' # <<< 22225 1726882750.72055: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 22225 1726882750.72073: stdout chunk (state=3): >>>import 'os' # <<< 22225 1726882750.72095: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 22225 1726882750.72111: stdout chunk (state=3): >>>Processing user site-packages <<< 22225 1726882750.72148: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 22225 1726882750.72161: stdout 
chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 22225 1726882750.72164: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 22225 1726882750.72195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 22225 1726882750.72199: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba095190> <<< 22225 1726882750.72269: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 22225 1726882750.72273: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.72297: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba095fd0> <<< 22225 1726882750.72331: stdout chunk (state=3): >>>import 'site' # <<< 22225 1726882750.72358: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 22225 1726882750.72656: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 22225 1726882750.72663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 22225 1726882750.72698: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 22225 1726882750.72723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 22225 1726882750.72767: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 22225 1726882750.72787: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22225 1726882750.72814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 22225 1726882750.72854: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0d3e60> <<< 22225 1726882750.72858: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 22225 1726882750.72876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 22225 1726882750.72895: stdout chunk (state=3): >>>import '_operator' # <<< 22225 1726882750.72908: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0d3f20> <<< 22225 1726882750.72928: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 22225 1726882750.72964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 22225 1726882750.72992: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 22225 1726882750.73039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.73061: stdout chunk (state=3): >>>import 'itertools' # <<< 22225 1726882750.73091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 22225 1726882750.73101: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba10b830> <<< 22225 1726882750.73110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 22225 1726882750.73139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba10bec0> <<< 22225 1726882750.73147: stdout chunk (state=3): >>>import '_collections' # <<< 22225 1726882750.73194: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0ebb30> <<< 22225 1726882750.73215: stdout chunk (state=3): >>>import '_functools' # <<< 22225 1726882750.73260: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0e9250> <<< 22225 1726882750.73368: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0d1010> <<< 22225 1726882750.73402: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 22225 1726882750.73423: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 22225 1726882750.73455: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 22225 1726882750.73474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 22225 1726882750.73515: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22225 1726882750.73548: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba12f830> <<< 22225 1726882750.73567: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba12e450> <<< 22225 1726882750.73601: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0ea120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba12cbc0> <<< 22225 1726882750.73676: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 22225 1726882750.73680: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 22225 1726882750.73688: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0d02c0> <<< 22225 1726882750.73706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 22225 1726882750.73713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 22225 1726882750.73731: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.73761: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba15cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15cbc0> <<< 22225 1726882750.73787: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.73800: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba15cf80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0cede0> <<< 22225 1726882750.73845: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.73860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 22225 1726882750.73893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 22225 1726882750.73918: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15d670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15d340> import 'importlib.machinery' # <<< 22225 1726882750.73953: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 22225 1726882750.73985: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15e570> <<< 22225 1726882750.73988: stdout chunk (state=3): >>>import 'importlib.util' # <<< 22225 1726882750.74007: stdout chunk (state=3): >>>import 'runpy' # <<< 22225 1726882750.74048: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 22225 1726882750.74064: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 22225 1726882750.74115: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 22225 1726882750.74119: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba1787a0> import 'errno' # <<< 22225 1726882750.74141: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.74164: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba179ee0> <<< 22225 1726882750.74183: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 22225 1726882750.74212: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 22225 1726882750.74219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 22225 1726882750.74241: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba17ad80> <<< 22225 1726882750.74271: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.74293: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.74328: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba17b3b0> <<< 22225 1726882750.74338: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba17a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 22225 1726882750.74369: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.74384: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba17bdd0> <<< 22225 1726882750.74394: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba17b530> <<< 22225 1726882750.74444: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15e5d0> <<< 22225 1726882750.74459: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 22225 1726882750.74498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 22225 1726882750.74510: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 22225 1726882750.74536: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 22225 1726882750.74568: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' 
executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9f43ce0> <<< 22225 1726882750.74594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 22225 1726882750.74631: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.74636: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9f6c770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f6c4d0> <<< 22225 1726882750.74659: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9f6c7a0> <<< 22225 1726882750.74683: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.74691: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9f6c980> <<< 22225 1726882750.74701: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f41e80> <<< 22225 1726882750.74729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 22225 1726882750.74838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 22225 1726882750.74861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 22225 1726882750.74877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 22225 1726882750.74885: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f6df70> <<< 22225 1726882750.74903: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f6cc20> <<< 22225 1726882750.74931: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15ecc0> <<< 22225 1726882750.74950: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 22225 1726882750.75006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.75025: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 22225 1726882750.75072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 22225 
1726882750.75102: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f9a2d0> <<< 22225 1726882750.75160: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 22225 1726882750.75163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.75193: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 22225 1726882750.75206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 22225 1726882750.75261: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fb23c0> <<< 22225 1726882750.75278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 22225 1726882750.75324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22225 1726882750.75376: stdout chunk (state=3): >>>import 'ntpath' # <<< 22225 1726882750.75403: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fef110> <<< 22225 1726882750.75426: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 22225 1726882750.75462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 22225 1726882750.75493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 22225 1726882750.75534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22225 1726882750.75629: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba015880> <<< 22225 1726882750.75703: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fef200> <<< 22225 1726882750.75750: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fb3050> <<< 22225 1726882750.75774: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 22225 1726882750.75781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9df01d0> <<< 22225 1726882750.75799: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fb1400> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f6ee70> <<< 22225 1726882750.75902: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 22225 1726882750.75928: stdout chunk (state=3): >>>import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f24b9fb1520> <<< 22225 1726882750.76007: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_6umniq4r/ansible_stat_payload.zip' <<< 22225 1726882750.76010: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.76157: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.76192: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 22225 1726882750.76246: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22225 1726882750.76318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 22225 1726882750.76358: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e45eb0> <<< 22225 1726882750.76365: stdout chunk (state=3): >>>import '_typing' # <<< 22225 1726882750.76570: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e1cda0> <<< 22225 1726882750.76573: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9df3ef0> # zipimport: zlib available <<< 22225 1726882750.76603: stdout chunk (state=3): >>>import 'ansible' # <<< 22225 1726882750.76610: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.76633: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.76646: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.76658: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 22225 1726882750.76669: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.78238: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.79506: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e1fd40> <<< 22225 1726882750.79542: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.79578: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 22225 1726882750.79585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 22225 1726882750.79609: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 22225 1726882750.79647: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # 
extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9e6d880> <<< 22225 1726882750.79695: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e6d610> <<< 22225 1726882750.79727: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e6cf20> <<< 22225 1726882750.79754: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 22225 1726882750.79758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 22225 1726882750.79806: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e6d9a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e468d0> <<< 22225 1726882750.79819: stdout chunk (state=3): >>>import 'atexit' # <<< 22225 1726882750.79848: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.79851: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9e6e570> <<< 22225 1726882750.79877: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.79890: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9e6e7b0> <<< 22225 1726882750.79904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 22225 1726882750.79967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 22225 1726882750.79969: stdout chunk (state=3): >>>import '_locale' # <<< 22225 1726882750.80025: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e6ecf0> <<< 22225 1726882750.80031: stdout chunk (state=3): >>>import 'pwd' # <<< 22225 1726882750.80056: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 22225 1726882750.80077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 22225 1726882750.80125: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd4a10> <<< 22225 1726882750.80145: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.80174: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9cd6630> <<< 22225 1726882750.80176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py <<< 22225 1726882750.80201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 22225 1726882750.80239: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd6ff0> <<< 22225 1726882750.80264: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 22225 1726882750.80288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 22225 1726882750.80311: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd7f80> <<< 22225 1726882750.80332: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 22225 1726882750.80373: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 22225 1726882750.80398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 22225 1726882750.80402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22225 1726882750.80460: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cdac60> <<< 22225 1726882750.80501: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9cdafc0> <<< 22225 1726882750.80528: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd8f20> <<< 22225 1726882750.80548: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22225 1726882750.80577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 22225 1726882750.80602: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 22225 1726882750.80632: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 22225 1726882750.80665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 22225 1726882750.80688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 22225 1726882750.80697: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cdeb10> <<< 22225 1726882750.80719: stdout chunk (state=3): >>>import '_tokenize' # <<< 22225 1726882750.80820: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cdd5e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f24b9cdd340> <<< 22225 1726882750.80825: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 22225 1726882750.80905: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cdfda0> <<< 22225 1726882750.80953: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd9430> <<< 22225 1726882750.80989: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d26cc0> <<< 22225 1726882750.81005: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d26ea0> <<< 22225 1726882750.81053: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 22225 1726882750.81072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 22225 1726882750.81126: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d28950> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d28710> <<< 22225 1726882750.81137: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 22225 1726882750.81265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 22225 1726882750.81324: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d2aed0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d29010> <<< 22225 1726882750.81346: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 22225 1726882750.81424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.81427: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches 
/usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 22225 1726882750.81451: stdout chunk (state=3): >>>import '_string' # <<< 22225 1726882750.81479: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d326f0> <<< 22225 1726882750.81626: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d2b080> <<< 22225 1726882750.81698: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d334d0> <<< 22225 1726882750.81743: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d338f0> <<< 22225 1726882750.81791: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d33830> <<< 22225 1726882750.81808: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d26f90> <<< 22225 1726882750.81840: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 22225 1726882750.81862: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 22225 1726882750.81881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 22225 1726882750.81903: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.81941: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d37140> <<< 22225 1726882750.82184: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d38320> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d358b0> <<< 22225 1726882750.82190: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d36c60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d35520> <<< 22225 1726882750.82220: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.82226: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 22225 1726882750.82325: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.82410: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.82439: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 22225 1726882750.82448: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.82467: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 22225 1726882750.82483: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.82614: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.82747: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.83363: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.83985: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 22225 1726882750.83992: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 22225 1726882750.84026: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 22225 1726882750.84040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.84096: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9dbc590> <<< 22225 1726882750.84191: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 22225 1726882750.84228: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9dbd2e0> <<< 22225 1726882750.84251: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d3b830> <<< 22225 1726882750.84286: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 22225 1726882750.84315: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.84330: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.84349: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 22225 1726882750.84515: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.84699: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 22225 1726882750.84708: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9dbd3a0> <<< 22225 1726882750.84719: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.85240: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.85746: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.85828: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.85911: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 22225 1726882750.85928: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.85963: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.86007: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 22225 1726882750.86011: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.86094: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.86179: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 22225 1726882750.86202: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.86232: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 22225 1726882750.86283: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.86316: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 22225 1726882750.86334: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.86593: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.86863: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22225 1726882750.86929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 22225 1726882750.86948: stdout chunk (state=3): >>>import '_ast' # <<< 22225 1726882750.87028: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9dbfc50> <<< 22225 1726882750.87038: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.87129: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.87207: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 22225 1726882750.87240: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 22225 1726882750.87251: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 22225 1726882750.87338: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22225 1726882750.87466: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9bcddf0> <<< 22225 
1726882750.87538: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9bce750> <<< 22225 1726882750.87562: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9dbe9f0> # zipimport: zlib available <<< 22225 1726882750.87608: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.87663: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 22225 1726882750.87666: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.87712: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.87760: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.87826: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.87901: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 22225 1726882750.87950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.88048: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9bcd370> <<< 22225 1726882750.88093: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9bce8d0> <<< 22225 1726882750.88132: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 22225 1726882750.88144: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.88212: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.88287: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.88317: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.88363: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 22225 1726882750.88391: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 22225 1726882750.88426: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 22225 1726882750.88450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 22225 1726882750.88514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 22225 1726882750.88541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches 
/usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 22225 1726882750.88605: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9c5eb40> <<< 22225 1726882750.88658: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9bd8860> <<< 22225 1726882750.88768: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9bd2900> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9bd2750> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 22225 1726882750.88779: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.88802: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.88842: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 22225 1726882750.88920: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 22225 1726882750.88925: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.88950: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 22225 1726882750.89106: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.89325: stdout chunk (state=3): >>># zipimport: zlib available <<< 22225 1726882750.89468: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 22225 1726882750.89837: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 22225 1726882750.89873: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc <<< 22225 1726882750.89891: stdout chunk (state=3): >>># cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing 
itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma <<< 22225 1726882750.89972: stdout chunk (state=3): >>># cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 22225 1726882750.89975: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] 
removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy <<< 22225 1726882750.90017: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 22225 1726882750.90288: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 22225 1726882750.90332: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression <<< 22225 
1726882750.90358: stdout chunk (state=3): >>># destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 22225 1726882750.90417: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 22225 1726882750.90471: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 22225 1726882750.90504: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime <<< 22225 1726882750.90547: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 22225 1726882750.90560: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 22225 1726882750.90627: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 22225 1726882750.90663: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 22225 1726882750.90684: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 22225 1726882750.90775: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator <<< 22225 1726882750.90781: stdout chunk (state=3): >>># cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 22225 1726882750.90846: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases 
# cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 22225 1726882750.90850: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 22225 1726882750.90985: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 22225 1726882750.91027: stdout chunk (state=3): >>># destroy _collections <<< 22225 1726882750.91030: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 22225 1726882750.91105: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 22225 1726882750.91111: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 22225 1726882750.91136: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib<<< 22225 1726882750.91158: stdout chunk (state=3): >>> <<< 22225 1726882750.91250: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 22225 1726882750.91275: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 22225 1726882750.91297: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 22225 1726882750.91355: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools <<< 22225 1726882750.91359: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 22225 1726882750.91375: stdout chunk (state=3): >>># clear sys.audit hooks <<< 22225 1726882750.91838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882750.91868: stderr chunk (state=3): >>>Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882750.91871: stdout chunk (state=3): >>><<< 22225 1726882750.91873: stderr chunk (state=3): >>><<< 22225 1726882750.91949: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba2c0530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba28fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba2c2ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba095190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba095fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0d3e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0d3f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba10b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba10bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0ebb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0e9250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0d1010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba12f830> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba12e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0ea120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba12cbc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0d02c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba15cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba15cf80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba0cede0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15d670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15d340> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15e570> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba1787a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba179ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f24ba17ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba17b3b0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba17a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24ba17bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba17b530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15e5d0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9f43ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9f6c770> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f6c4d0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9f6c7a0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9f6c980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f41e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f6df70> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f6cc20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba15ecc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f9a2d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fb23c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fef110> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24ba015880> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fef200> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fb3050> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9df01d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9fb1400> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9f6ee70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f24b9fb1520> # zipimport: found 30 names in '/tmp/ansible_stat_payload_6umniq4r/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e45eb0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e1cda0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9df3ef0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e1fd40> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9e6d880> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e6d610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e6cf20> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e6d9a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e468d0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9e6e570> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9e6e7b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9e6ecf0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd4a10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9cd6630> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd6ff0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd7f80> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cdac60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9cdafc0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd8f20> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cdeb10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cdd5e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cdd340> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cdfda0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9cd9430> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d26cc0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d26ea0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d28950> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d28710> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d2aed0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d29010> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d326f0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d2b080> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d334d0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d338f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d33830> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d26f90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d37140> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d38320> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d358b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9d36c60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d35520> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9dbc590> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9dbd2e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9d3b830> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9dbd3a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9dbfc50> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9bcddf0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9bce750> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9dbe9f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f24b9bcd370> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9bce8d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9c5eb40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9bd8860> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9bd2900> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f24b9bd2750> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 22225 1726882750.93298: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882750.93303: _low_level_execute_command(): starting 22225 1726882750.93306: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882750.515113-22459-167927602550723/ > /dev/null 2>&1 && sleep 0' 22225 1726882750.93342: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882750.93357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882750.93406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882750.93418: stderr chunk (state=3): >>>debug2: match found <<< 22225 1726882750.93434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882750.93519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882750.93543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882750.93558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882750.93643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882750.95685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882750.95689: stdout chunk (state=3): >>><<< 22225 1726882750.95692: stderr chunk (state=3): >>><<< 22225 1726882750.95718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882750.95733: handler run complete 22225 1726882750.95927: attempt loop complete, returning result 22225 1726882750.95931: _execute() done 22225 1726882750.95933: dumping result to json 22225 1726882750.95935: done dumping result, returning 22225 1726882750.95938: done running TaskExecutor() for managed_node1/TASK: 
Check if system is ostree [0affc7ec-ae25-ec05-55b7-0000000000cc] 22225 1726882750.95940: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000cc 22225 1726882750.96010: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000cc 22225 1726882750.96013: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 22225 1726882750.96088: no more pending results, returning what we have 22225 1726882750.96092: results queue empty 22225 1726882750.96093: checking for any_errors_fatal 22225 1726882750.96101: done checking for any_errors_fatal 22225 1726882750.96101: checking for max_fail_percentage 22225 1726882750.96104: done checking for max_fail_percentage 22225 1726882750.96104: checking to see if all hosts have failed and the running result is not ok 22225 1726882750.96105: done checking to see if all hosts have failed 22225 1726882750.96106: getting the remaining hosts for this loop 22225 1726882750.96108: done getting the remaining hosts for this loop 22225 1726882750.96113: getting the next task for host managed_node1 22225 1726882750.96120: done getting next task for host managed_node1 22225 1726882750.96129: ^ task is: TASK: Set flag to indicate system is ostree 22225 1726882750.96133: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882750.96136: getting variables 22225 1726882750.96138: in VariableManager get_vars() 22225 1726882750.96169: Calling all_inventory to load vars for managed_node1 22225 1726882750.96171: Calling groups_inventory to load vars for managed_node1 22225 1726882750.96175: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882750.96189: Calling all_plugins_play to load vars for managed_node1 22225 1726882750.96192: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882750.96195: Calling groups_plugins_play to load vars for managed_node1 22225 1726882750.96676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882750.96939: done with get_vars() 22225 1726882750.96950: done getting variables 22225 1726882750.97059: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:39:10 -0400 (0:00:00.500) 0:00:06.363 ****** 22225 1726882750.97090: entering _queue_task() for managed_node1/set_fact 22225 1726882750.97091: Creating lock for set_fact 22225 1726882750.97377: worker is 1 (out of 1 available) 22225 1726882750.97393: exiting _queue_task() for managed_node1/set_fact 22225 1726882750.97405: done queuing things up, now waiting for results queue to drain 22225 1726882750.97406: waiting for pending results... 
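The "Check if system is ostree" task above returned ok with stat.exists == false, and the companion "Set flag to indicate system is ostree" set_fact task is now being queued from el_repo_setup.yml. A plausible sketch of that stat task, assuming the conventional /run/ostree-booted marker path and reusing the guard the log later evaluates for the set_fact task (the actual task file is not reproduced in this log):

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted   # assumed marker file; the path is not shown in the log
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined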
22225 1726882750.97741: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 22225 1726882750.97828: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000cd 22225 1726882750.97831: variable 'ansible_search_path' from source: unknown 22225 1726882750.97834: variable 'ansible_search_path' from source: unknown 22225 1726882750.97852: calling self._execute() 22225 1726882750.97941: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882750.97966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882750.97969: variable 'omit' from source: magic vars 22225 1726882750.98631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882750.98830: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882750.98890: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882750.98931: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882750.98976: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882750.99079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882750.99112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882750.99146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882750.99188: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882750.99327: Evaluated conditional (not __network_is_ostree is defined): True 22225 1726882750.99393: variable 'omit' from source: magic vars 22225 1726882750.99398: variable 'omit' from source: magic vars 22225 1726882750.99510: variable '__ostree_booted_stat' from source: set_fact 22225 1726882750.99610: variable 'omit' from source: magic vars 22225 1726882750.99613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882750.99631: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882750.99653: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882750.99677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882750.99696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882750.99745: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882750.99754: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882750.99763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882750.99879: Set connection var ansible_connection to ssh 22225 
1726882750.99941: Set connection var ansible_pipelining to False 22225 1726882750.99946: Set connection var ansible_shell_executable to /bin/sh 22225 1726882750.99949: Set connection var ansible_timeout to 10 22225 1726882750.99951: Set connection var ansible_shell_type to sh 22225 1726882750.99953: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882751.00029: variable 'ansible_shell_executable' from source: unknown 22225 1726882751.00033: variable 'ansible_connection' from source: unknown 22225 1726882751.00036: variable 'ansible_module_compression' from source: unknown 22225 1726882751.00038: variable 'ansible_shell_type' from source: unknown 22225 1726882751.00040: variable 'ansible_shell_executable' from source: unknown 22225 1726882751.00047: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.00050: variable 'ansible_pipelining' from source: unknown 22225 1726882751.00052: variable 'ansible_timeout' from source: unknown 22225 1726882751.00054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.00149: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882751.00175: variable 'omit' from source: magic vars 22225 1726882751.00189: starting attempt loop 22225 1726882751.00197: running the handler 22225 1726882751.00213: handler run complete 22225 1726882751.00267: attempt loop complete, returning result 22225 1726882751.00270: _execute() done 22225 1726882751.00273: dumping result to json 22225 1726882751.00275: done dumping result, returning 22225 1726882751.00279: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0affc7ec-ae25-ec05-55b7-0000000000cd] 22225 1726882751.00285: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000cd 22225 1726882751.00558: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000cd 22225 1726882751.00562: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 22225 1726882751.00617: no more pending results, returning what we have 22225 1726882751.00620: results queue empty 22225 1726882751.00621: checking for any_errors_fatal 22225 1726882751.00628: done checking for any_errors_fatal 22225 1726882751.00629: checking for max_fail_percentage 22225 1726882751.00630: done checking for max_fail_percentage 22225 1726882751.00631: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.00632: done checking to see if all hosts have failed 22225 1726882751.00633: getting the remaining hosts for this loop 22225 1726882751.00634: done getting the remaining hosts for this loop 22225 1726882751.00638: getting the next task for host managed_node1 22225 1726882751.00646: done getting next task for host managed_node1 22225 1726882751.00649: ^ task is: TASK: Fix CentOS6 Base repo 22225 1726882751.00652: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882751.00656: getting variables 22225 1726882751.00657: in VariableManager get_vars() 22225 1726882751.00740: Calling all_inventory to load vars for managed_node1 22225 1726882751.00744: Calling groups_inventory to load vars for managed_node1 22225 1726882751.00747: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.00758: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.00761: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.00769: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.01049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.01296: done with get_vars() 22225 1726882751.01305: done getting variables 22225 1726882751.01420: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:39:11 -0400 (0:00:00.043) 0:00:06.407 ****** 22225 1726882751.01454: entering _queue_task() for managed_node1/copy 22225 1726882751.01769: worker is 1 (out of 1 available) 22225 1726882751.01779: exiting _queue_task() for managed_node1/copy 22225 1726882751.01792: done queuing things up, now waiting for results queue to drain 22225 1726882751.01794: waiting for pending results... 
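The set_fact result above records __network_is_ostree as false, matching the earlier stat result. A minimal sketch of what the "Set flag to indicate system is ostree" task at el_repo_setup.yml:22 likely looks like, assuming the fact is derived from the registered stat (an inference; only the fact name, the module, and the conditional are visible in the log):

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined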
22225 1726882751.02001: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 22225 1726882751.02105: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000cf 22225 1726882751.02128: variable 'ansible_search_path' from source: unknown 22225 1726882751.02135: variable 'ansible_search_path' from source: unknown 22225 1726882751.02175: calling self._execute() 22225 1726882751.02260: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.02311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.02315: variable 'omit' from source: magic vars 22225 1726882751.02796: variable 'ansible_distribution' from source: facts 22225 1726882751.02825: Evaluated conditional (ansible_distribution == 'CentOS'): False 22225 1726882751.02832: when evaluation is False, skipping this task 22225 1726882751.02840: _execute() done 22225 1726882751.02847: dumping result to json 22225 1726882751.02858: done dumping result, returning 22225 1726882751.02927: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0affc7ec-ae25-ec05-55b7-0000000000cf] 22225 1726882751.02930: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000cf 22225 1726882751.03132: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000cf 22225 1726882751.03136: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 22225 1726882751.03192: no more pending results, returning what we have 22225 1726882751.03195: results queue empty 22225 1726882751.03196: checking for any_errors_fatal 22225 1726882751.03200: done checking for any_errors_fatal 22225 1726882751.03201: checking for max_fail_percentage 22225 1726882751.03202: done checking for max_fail_percentage 22225 1726882751.03203: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.03204: done checking to see if all hosts have failed 22225 1726882751.03205: getting the remaining hosts for this loop 22225 1726882751.03206: done getting the remaining hosts for this loop 22225 1726882751.03209: getting the next task for host managed_node1 22225 1726882751.03215: done getting next task for host managed_node1 22225 1726882751.03217: ^ task is: TASK: Include the task 'enable_epel.yml' 22225 1726882751.03220: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882751.03271: getting variables 22225 1726882751.03273: in VariableManager get_vars() 22225 1726882751.03298: Calling all_inventory to load vars for managed_node1 22225 1726882751.03301: Calling groups_inventory to load vars for managed_node1 22225 1726882751.03304: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.03313: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.03316: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.03319: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.03513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.03777: done with get_vars() 22225 1726882751.03789: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:39:11 -0400 (0:00:00.024) 0:00:06.431 ****** 22225 1726882751.03875: entering _queue_task() for managed_node1/include_tasks 22225 1726882751.04095: worker is 1 (out of 1 available) 22225 1726882751.04107: exiting _queue_task() for managed_node1/include_tasks 22225 1726882751.04117: done queuing things up, now waiting for results queue to drain 22225 1726882751.04119: waiting for pending results... 22225 1726882751.04355: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 22225 1726882751.04445: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000d0 22225 1726882751.04458: variable 'ansible_search_path' from source: unknown 22225 1726882751.04462: variable 'ansible_search_path' from source: unknown 22225 1726882751.04496: calling self._execute() 22225 1726882751.04569: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.04576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.04589: variable 'omit' from source: magic vars 22225 1726882751.05094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882751.07517: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882751.07595: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882751.07648: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882751.07670: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882751.07731: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882751.07790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882751.07821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882751.07853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 22225 1726882751.07927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882751.07930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882751.08040: variable '__network_is_ostree' from source: set_fact 22225 1726882751.08062: Evaluated conditional (not __network_is_ostree | d(false)): True 22225 1726882751.08068: _execute() done 22225 1726882751.08071: dumping result to json 22225 1726882751.08074: done dumping result, returning 22225 1726882751.08127: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0affc7ec-ae25-ec05-55b7-0000000000d0] 22225 1726882751.08130: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000d0 22225 1726882751.08203: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000d0 22225 1726882751.08207: WORKER PROCESS EXITING 22225 1726882751.08237: no more pending results, returning what we have 22225 1726882751.08243: in VariableManager get_vars() 22225 1726882751.08278: Calling all_inventory to load vars for managed_node1 22225 1726882751.08281: Calling groups_inventory to load vars for managed_node1 22225 1726882751.08287: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.08299: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.08302: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.08305: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.08711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.08950: done with get_vars() 22225 1726882751.08959: variable 'ansible_search_path' from source: unknown 22225 1726882751.08960: variable 'ansible_search_path' from source: unknown 22225 1726882751.09001: we have included files to process 22225 1726882751.09003: generating all_blocks data 22225 1726882751.09004: done generating all_blocks data 22225 1726882751.09010: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22225 1726882751.09012: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22225 1726882751.09014: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22225 1726882751.09759: done processing included file 22225 1726882751.09762: iterating over new_blocks loaded from include file 22225 1726882751.09763: in VariableManager get_vars() 22225 1726882751.09775: done with get_vars() 22225 1726882751.09776: filtering new block on tags 22225 1726882751.09802: done filtering new block on tags 22225 1726882751.09804: in VariableManager get_vars() 22225 1726882751.09814: done with get_vars() 22225 1726882751.09816: filtering new block on tags 22225 1726882751.09830: done filtering new block on tags 22225 1726882751.09832: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 22225 1726882751.09838: extending task lists for all hosts with included blocks 
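The include is gated on "not __network_is_ostree | d(false)", which evaluated True above, so enable_epel.yml is loaded and its blocks are appended to the host's task list. A hedged sketch of the include at el_repo_setup.yml:51 (the relative file path is an assumption):

    - name: Include the task 'enable_epel.yml'
      include_tasks: tasks/enable_epel.yml
      when: not __network_is_ostree | d(false)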
22225 1726882751.09951: done extending task lists 22225 1726882751.09952: done processing included files 22225 1726882751.09953: results queue empty 22225 1726882751.09954: checking for any_errors_fatal 22225 1726882751.09957: done checking for any_errors_fatal 22225 1726882751.09958: checking for max_fail_percentage 22225 1726882751.09959: done checking for max_fail_percentage 22225 1726882751.09960: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.09961: done checking to see if all hosts have failed 22225 1726882751.09962: getting the remaining hosts for this loop 22225 1726882751.09963: done getting the remaining hosts for this loop 22225 1726882751.09965: getting the next task for host managed_node1 22225 1726882751.09970: done getting next task for host managed_node1 22225 1726882751.09972: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 22225 1726882751.09975: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882751.09977: getting variables 22225 1726882751.09978: in VariableManager get_vars() 22225 1726882751.09988: Calling all_inventory to load vars for managed_node1 22225 1726882751.09990: Calling groups_inventory to load vars for managed_node1 22225 1726882751.09993: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.09998: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.10006: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.10009: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.10193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.10428: done with get_vars() 22225 1726882751.10438: done getting variables 22225 1726882751.10506: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 22225 1726882751.10721: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:39:11 -0400 (0:00:00.068) 0:00:06.500 ****** 22225 1726882751.10771: entering _queue_task() for managed_node1/command 22225 1726882751.10773: Creating lock for command 22225 1726882751.11061: worker is 1 (out of 1 available) 22225 1726882751.11073: exiting _queue_task() for managed_node1/command 22225 1726882751.11088: done queuing things up, now waiting for results queue to drain 22225 1726882751.11091: waiting for pending results... 
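The task name "Create EPEL {{ ansible_distribution_major_version }}" renders to "Create EPEL 40" because the gathered facts report a major version of 40; it is queued through the command action plugin and, as the next entries show, skipped because the distribution is not RedHat or CentOS. A rough, hypothetical sketch of such a task at enable_epel.yml:8 (the command and URL are placeholders, not taken from the log):

    - name: Create EPEL {{ ansible_distribution_major_version }}
      command: >-
        rpm -iv https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
        # placeholder command; only the command module and the conditional are implied by the log
      when: ansible_distribution in ['RedHat', 'CentOS']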
22225 1726882751.11491: running TaskExecutor() for managed_node1/TASK: Create EPEL 40 22225 1726882751.11505: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000ea 22225 1726882751.11526: variable 'ansible_search_path' from source: unknown 22225 1726882751.11536: variable 'ansible_search_path' from source: unknown 22225 1726882751.11586: calling self._execute() 22225 1726882751.11695: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.11699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.11724: variable 'omit' from source: magic vars 22225 1726882751.12143: variable 'ansible_distribution' from source: facts 22225 1726882751.12241: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22225 1726882751.12245: when evaluation is False, skipping this task 22225 1726882751.12249: _execute() done 22225 1726882751.12252: dumping result to json 22225 1726882751.12254: done dumping result, returning 22225 1726882751.12256: done running TaskExecutor() for managed_node1/TASK: Create EPEL 40 [0affc7ec-ae25-ec05-55b7-0000000000ea] 22225 1726882751.12259: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ea 22225 1726882751.12425: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ea 22225 1726882751.12431: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22225 1726882751.12489: no more pending results, returning what we have 22225 1726882751.12493: results queue empty 22225 1726882751.12493: checking for any_errors_fatal 22225 1726882751.12495: done checking for any_errors_fatal 22225 1726882751.12495: checking for max_fail_percentage 22225 1726882751.12497: done checking for max_fail_percentage 22225 1726882751.12498: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.12498: done checking to see if all hosts have failed 22225 1726882751.12499: getting the remaining hosts for this loop 22225 1726882751.12501: done getting the remaining hosts for this loop 22225 1726882751.12504: getting the next task for host managed_node1 22225 1726882751.12510: done getting next task for host managed_node1 22225 1726882751.12512: ^ task is: TASK: Install yum-utils package 22225 1726882751.12516: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882751.12519: getting variables 22225 1726882751.12520: in VariableManager get_vars() 22225 1726882751.12548: Calling all_inventory to load vars for managed_node1 22225 1726882751.12550: Calling groups_inventory to load vars for managed_node1 22225 1726882751.12553: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.12563: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.12566: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.12569: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.12762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.13024: done with get_vars() 22225 1726882751.13034: done getting variables 22225 1726882751.13137: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:39:11 -0400 (0:00:00.023) 0:00:06.524 ****** 22225 1726882751.13167: entering _queue_task() for managed_node1/package 22225 1726882751.13169: Creating lock for package 22225 1726882751.13638: worker is 1 (out of 1 available) 22225 1726882751.13651: exiting _queue_task() for managed_node1/package 22225 1726882751.13661: done queuing things up, now waiting for results queue to drain 22225 1726882751.13662: waiting for pending results... 
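The "Install yum-utils package" task is dispatched through the generic package action plugin and carries the same distribution guard, so the next entries show it skipped as well. A minimal sketch of the task at enable_epel.yml:26, with state: present assumed:

    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present   # assumed; only the module and package name are implied by the log
      when: ansible_distribution in ['RedHat', 'CentOS']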
22225 1726882751.13745: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 22225 1726882751.13847: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000eb 22225 1726882751.13859: variable 'ansible_search_path' from source: unknown 22225 1726882751.13863: variable 'ansible_search_path' from source: unknown 22225 1726882751.13989: calling self._execute() 22225 1726882751.13996: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.14000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.14006: variable 'omit' from source: magic vars 22225 1726882751.14408: variable 'ansible_distribution' from source: facts 22225 1726882751.14425: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22225 1726882751.14431: when evaluation is False, skipping this task 22225 1726882751.14436: _execute() done 22225 1726882751.14439: dumping result to json 22225 1726882751.14443: done dumping result, returning 22225 1726882751.14454: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0affc7ec-ae25-ec05-55b7-0000000000eb] 22225 1726882751.14460: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000eb 22225 1726882751.14660: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000eb 22225 1726882751.14664: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22225 1726882751.14704: no more pending results, returning what we have 22225 1726882751.14707: results queue empty 22225 1726882751.14708: checking for any_errors_fatal 22225 1726882751.14714: done checking for any_errors_fatal 22225 1726882751.14715: checking for max_fail_percentage 22225 1726882751.14716: done checking for max_fail_percentage 22225 1726882751.14717: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.14718: done checking to see if all hosts have failed 22225 1726882751.14718: getting the remaining hosts for this loop 22225 1726882751.14720: done getting the remaining hosts for this loop 22225 1726882751.14724: getting the next task for host managed_node1 22225 1726882751.14730: done getting next task for host managed_node1 22225 1726882751.14732: ^ task is: TASK: Enable EPEL 7 22225 1726882751.14736: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882751.14739: getting variables 22225 1726882751.14740: in VariableManager get_vars() 22225 1726882751.14763: Calling all_inventory to load vars for managed_node1 22225 1726882751.14765: Calling groups_inventory to load vars for managed_node1 22225 1726882751.14768: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.14779: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.14784: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.14788: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.15011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.15235: done with get_vars() 22225 1726882751.15246: done getting variables 22225 1726882751.15307: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:39:11 -0400 (0:00:00.021) 0:00:06.546 ****** 22225 1726882751.15339: entering _queue_task() for managed_node1/command 22225 1726882751.15756: worker is 1 (out of 1 available) 22225 1726882751.15766: exiting _queue_task() for managed_node1/command 22225 1726882751.15775: done queuing things up, now waiting for results queue to drain 22225 1726882751.15776: waiting for pending results... 22225 1726882751.15884: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 22225 1726882751.15977: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000ec 22225 1726882751.16015: variable 'ansible_search_path' from source: unknown 22225 1726882751.16020: variable 'ansible_search_path' from source: unknown 22225 1726882751.16030: calling self._execute() 22225 1726882751.16096: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.16103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.16113: variable 'omit' from source: magic vars 22225 1726882751.16510: variable 'ansible_distribution' from source: facts 22225 1726882751.16529: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22225 1726882751.16624: when evaluation is False, skipping this task 22225 1726882751.16635: _execute() done 22225 1726882751.16637: dumping result to json 22225 1726882751.16640: done dumping result, returning 22225 1726882751.16642: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0affc7ec-ae25-ec05-55b7-0000000000ec] 22225 1726882751.16646: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ec 22225 1726882751.16707: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ec 22225 1726882751.16710: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22225 1726882751.16773: no more pending results, returning what we have 22225 1726882751.16776: results queue empty 22225 1726882751.16777: checking for any_errors_fatal 22225 1726882751.16784: done checking for any_errors_fatal 22225 1726882751.16785: checking 
for max_fail_percentage 22225 1726882751.16787: done checking for max_fail_percentage 22225 1726882751.16788: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.16788: done checking to see if all hosts have failed 22225 1726882751.16789: getting the remaining hosts for this loop 22225 1726882751.16790: done getting the remaining hosts for this loop 22225 1726882751.16794: getting the next task for host managed_node1 22225 1726882751.16800: done getting next task for host managed_node1 22225 1726882751.16802: ^ task is: TASK: Enable EPEL 8 22225 1726882751.16806: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882751.16809: getting variables 22225 1726882751.16811: in VariableManager get_vars() 22225 1726882751.16838: Calling all_inventory to load vars for managed_node1 22225 1726882751.16840: Calling groups_inventory to load vars for managed_node1 22225 1726882751.16843: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.16853: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.16856: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.16859: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.17128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.17367: done with get_vars() 22225 1726882751.17376: done getting variables 22225 1726882751.17438: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:39:11 -0400 (0:00:00.021) 0:00:06.567 ****** 22225 1726882751.17468: entering _queue_task() for managed_node1/command 22225 1726882751.17705: worker is 1 (out of 1 available) 22225 1726882751.17719: exiting _queue_task() for managed_node1/command 22225 1726882751.17834: done queuing things up, now waiting for results queue to drain 22225 1726882751.17836: waiting for pending results... 
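The "Enable EPEL 7" task at enable_epel.yml:32 was just skipped, and "Enable EPEL 8" at enable_epel.yml:37 is queued through the same command action plugin with the same distribution conditional. A hypothetical sketch of the EL8-style variant (the exact command is an assumption; only the module and the when: condition are visible in the log):

    - name: Enable EPEL 8
      command: dnf config-manager --set-enabled epel   # placeholder command, not shown in the log
      when: ansible_distribution in ['RedHat', 'CentOS']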
22225 1726882751.18441: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 22225 1726882751.18446: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000ed 22225 1726882751.18449: variable 'ansible_search_path' from source: unknown 22225 1726882751.18452: variable 'ansible_search_path' from source: unknown 22225 1726882751.18455: calling self._execute() 22225 1726882751.18457: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.18460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.18462: variable 'omit' from source: magic vars 22225 1726882751.18623: variable 'ansible_distribution' from source: facts 22225 1726882751.18648: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22225 1726882751.18651: when evaluation is False, skipping this task 22225 1726882751.18653: _execute() done 22225 1726882751.18656: dumping result to json 22225 1726882751.18658: done dumping result, returning 22225 1726882751.18662: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0affc7ec-ae25-ec05-55b7-0000000000ed] 22225 1726882751.18664: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ed 22225 1726882751.18837: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ed 22225 1726882751.18841: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22225 1726882751.18877: no more pending results, returning what we have 22225 1726882751.18880: results queue empty 22225 1726882751.18881: checking for any_errors_fatal 22225 1726882751.18887: done checking for any_errors_fatal 22225 1726882751.18888: checking for max_fail_percentage 22225 1726882751.18889: done checking for max_fail_percentage 22225 1726882751.18890: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.18891: done checking to see if all hosts have failed 22225 1726882751.18892: getting the remaining hosts for this loop 22225 1726882751.18893: done getting the remaining hosts for this loop 22225 1726882751.18897: getting the next task for host managed_node1 22225 1726882751.18904: done getting next task for host managed_node1 22225 1726882751.18907: ^ task is: TASK: Enable EPEL 6 22225 1726882751.18910: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882751.18913: getting variables 22225 1726882751.18914: in VariableManager get_vars() 22225 1726882751.18942: Calling all_inventory to load vars for managed_node1 22225 1726882751.18944: Calling groups_inventory to load vars for managed_node1 22225 1726882751.18948: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.18957: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.18960: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.18963: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.19201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.19442: done with get_vars() 22225 1726882751.19452: done getting variables 22225 1726882751.19510: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:39:11 -0400 (0:00:00.020) 0:00:06.588 ****** 22225 1726882751.19540: entering _queue_task() for managed_node1/copy 22225 1726882751.19764: worker is 1 (out of 1 available) 22225 1726882751.19777: exiting _queue_task() for managed_node1/copy 22225 1726882751.19792: done queuing things up, now waiting for results queue to drain 22225 1726882751.19793: waiting for pending results... 22225 1726882751.20042: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 22225 1726882751.20146: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000ef 22225 1726882751.20163: variable 'ansible_search_path' from source: unknown 22225 1726882751.20167: variable 'ansible_search_path' from source: unknown 22225 1726882751.20203: calling self._execute() 22225 1726882751.20281: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.20285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.20297: variable 'omit' from source: magic vars 22225 1726882751.20971: variable 'ansible_distribution' from source: facts 22225 1726882751.21048: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22225 1726882751.21051: when evaluation is False, skipping this task 22225 1726882751.21054: _execute() done 22225 1726882751.21056: dumping result to json 22225 1726882751.21057: done dumping result, returning 22225 1726882751.21060: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0affc7ec-ae25-ec05-55b7-0000000000ef] 22225 1726882751.21062: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ef 22225 1726882751.21129: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000ef 22225 1726882751.21132: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22225 1726882751.21176: no more pending results, returning what we have 22225 1726882751.21179: results queue empty 22225 1726882751.21180: checking for any_errors_fatal 22225 1726882751.21186: done checking for any_errors_fatal 22225 1726882751.21188: checking for 
max_fail_percentage 22225 1726882751.21189: done checking for max_fail_percentage 22225 1726882751.21190: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.21191: done checking to see if all hosts have failed 22225 1726882751.21192: getting the remaining hosts for this loop 22225 1726882751.21193: done getting the remaining hosts for this loop 22225 1726882751.21197: getting the next task for host managed_node1 22225 1726882751.21205: done getting next task for host managed_node1 22225 1726882751.21207: ^ task is: TASK: Set network provider to 'nm' 22225 1726882751.21210: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882751.21214: getting variables 22225 1726882751.21215: in VariableManager get_vars() 22225 1726882751.21246: Calling all_inventory to load vars for managed_node1 22225 1726882751.21248: Calling groups_inventory to load vars for managed_node1 22225 1726882751.21252: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.21263: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.21266: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.21270: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.21744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.21975: done with get_vars() 22225 1726882751.21986: done getting variables 22225 1726882751.22044: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:13 Friday 20 September 2024 21:39:11 -0400 (0:00:00.025) 0:00:06.613 ****** 22225 1726882751.22069: entering _queue_task() for managed_node1/set_fact 22225 1726882751.22288: worker is 1 (out of 1 available) 22225 1726882751.22300: exiting _queue_task() for managed_node1/set_fact 22225 1726882751.22311: done queuing things up, now waiting for results queue to drain 22225 1726882751.22312: waiting for pending results... 
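The test playbook tests_ipv6_nm.yml:13 then pins the provider with a set_fact; the result a few entries below confirms network_provider is set to "nm". A minimal sketch of that task:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm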
22225 1726882751.22639: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 22225 1726882751.22645: in run() - task 0affc7ec-ae25-ec05-55b7-000000000007 22225 1726882751.22660: variable 'ansible_search_path' from source: unknown 22225 1726882751.22698: calling self._execute() 22225 1726882751.22777: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.22784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.22903: variable 'omit' from source: magic vars 22225 1726882751.22906: variable 'omit' from source: magic vars 22225 1726882751.22941: variable 'omit' from source: magic vars 22225 1726882751.22980: variable 'omit' from source: magic vars 22225 1726882751.23026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882751.23067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882751.23089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882751.23107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882751.23119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882751.23154: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882751.23158: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.23160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.23265: Set connection var ansible_connection to ssh 22225 1726882751.23276: Set connection var ansible_pipelining to False 22225 1726882751.23287: Set connection var ansible_shell_executable to /bin/sh 22225 1726882751.23294: Set connection var ansible_timeout to 10 22225 1726882751.23297: Set connection var ansible_shell_type to sh 22225 1726882751.23303: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882751.23524: variable 'ansible_shell_executable' from source: unknown 22225 1726882751.23530: variable 'ansible_connection' from source: unknown 22225 1726882751.23534: variable 'ansible_module_compression' from source: unknown 22225 1726882751.23537: variable 'ansible_shell_type' from source: unknown 22225 1726882751.23540: variable 'ansible_shell_executable' from source: unknown 22225 1726882751.23543: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.23547: variable 'ansible_pipelining' from source: unknown 22225 1726882751.23549: variable 'ansible_timeout' from source: unknown 22225 1726882751.23552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.23559: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882751.23562: variable 'omit' from source: magic vars 22225 1726882751.23565: starting attempt loop 22225 1726882751.23567: running the handler 22225 1726882751.23569: handler run complete 22225 1726882751.23572: attempt loop complete, returning result 22225 1726882751.23574: _execute() done 22225 1726882751.23576: 
dumping result to json 22225 1726882751.23578: done dumping result, returning 22225 1726882751.23581: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0affc7ec-ae25-ec05-55b7-000000000007] 22225 1726882751.23583: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000007 22225 1726882751.23651: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000007 22225 1726882751.23654: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 22225 1726882751.23715: no more pending results, returning what we have 22225 1726882751.23718: results queue empty 22225 1726882751.23719: checking for any_errors_fatal 22225 1726882751.23727: done checking for any_errors_fatal 22225 1726882751.23728: checking for max_fail_percentage 22225 1726882751.23730: done checking for max_fail_percentage 22225 1726882751.23730: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.23731: done checking to see if all hosts have failed 22225 1726882751.23732: getting the remaining hosts for this loop 22225 1726882751.23733: done getting the remaining hosts for this loop 22225 1726882751.23737: getting the next task for host managed_node1 22225 1726882751.23744: done getting next task for host managed_node1 22225 1726882751.23746: ^ task is: TASK: meta (flush_handlers) 22225 1726882751.23748: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882751.23752: getting variables 22225 1726882751.23753: in VariableManager get_vars() 22225 1726882751.23786: Calling all_inventory to load vars for managed_node1 22225 1726882751.23790: Calling groups_inventory to load vars for managed_node1 22225 1726882751.23793: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.23807: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.23810: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.23813: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.24045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.24278: done with get_vars() 22225 1726882751.24287: done getting variables 22225 1726882751.24364: in VariableManager get_vars() 22225 1726882751.24373: Calling all_inventory to load vars for managed_node1 22225 1726882751.24375: Calling groups_inventory to load vars for managed_node1 22225 1726882751.24378: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.24382: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.24384: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.24387: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.24587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.24830: done with get_vars() 22225 1726882751.24844: done queuing things up, now waiting for results queue to drain 22225 1726882751.24846: results queue empty 22225 1726882751.24847: checking for any_errors_fatal 22225 1726882751.24849: done checking for any_errors_fatal 22225 1726882751.24850: checking for 
max_fail_percentage 22225 1726882751.24851: done checking for max_fail_percentage 22225 1726882751.24851: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.24852: done checking to see if all hosts have failed 22225 1726882751.24853: getting the remaining hosts for this loop 22225 1726882751.24854: done getting the remaining hosts for this loop 22225 1726882751.24856: getting the next task for host managed_node1 22225 1726882751.24865: done getting next task for host managed_node1 22225 1726882751.24867: ^ task is: TASK: meta (flush_handlers) 22225 1726882751.24868: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882751.24880: getting variables 22225 1726882751.24881: in VariableManager get_vars() 22225 1726882751.24889: Calling all_inventory to load vars for managed_node1 22225 1726882751.24891: Calling groups_inventory to load vars for managed_node1 22225 1726882751.24893: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.24899: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.24901: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.24904: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.25073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.25313: done with get_vars() 22225 1726882751.25321: done getting variables 22225 1726882751.25367: in VariableManager get_vars() 22225 1726882751.25375: Calling all_inventory to load vars for managed_node1 22225 1726882751.25377: Calling groups_inventory to load vars for managed_node1 22225 1726882751.25380: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.25384: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.25386: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.25390: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.25582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.25812: done with get_vars() 22225 1726882751.25825: done queuing things up, now waiting for results queue to drain 22225 1726882751.25827: results queue empty 22225 1726882751.25827: checking for any_errors_fatal 22225 1726882751.25829: done checking for any_errors_fatal 22225 1726882751.25829: checking for max_fail_percentage 22225 1726882751.25830: done checking for max_fail_percentage 22225 1726882751.25831: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.25832: done checking to see if all hosts have failed 22225 1726882751.25833: getting the remaining hosts for this loop 22225 1726882751.25833: done getting the remaining hosts for this loop 22225 1726882751.25836: getting the next task for host managed_node1 22225 1726882751.25839: done getting next task for host managed_node1 22225 1726882751.25839: ^ task is: None 22225 1726882751.25845: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 22225 1726882751.25847: done queuing things up, now waiting for results queue to drain 22225 1726882751.25848: results queue empty 22225 1726882751.25849: checking for any_errors_fatal 22225 1726882751.25849: done checking for any_errors_fatal 22225 1726882751.25850: checking for max_fail_percentage 22225 1726882751.25851: done checking for max_fail_percentage 22225 1726882751.25852: checking to see if all hosts have failed and the running result is not ok 22225 1726882751.25852: done checking to see if all hosts have failed 22225 1726882751.25854: getting the next task for host managed_node1 22225 1726882751.25861: done getting next task for host managed_node1 22225 1726882751.25862: ^ task is: None 22225 1726882751.25863: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882751.25911: in VariableManager get_vars() 22225 1726882751.25938: done with get_vars() 22225 1726882751.25945: in VariableManager get_vars() 22225 1726882751.25971: done with get_vars() 22225 1726882751.25976: variable 'omit' from source: magic vars 22225 1726882751.26008: in VariableManager get_vars() 22225 1726882751.26027: done with get_vars() 22225 1726882751.26051: variable 'omit' from source: magic vars PLAY [Play for testing IPv6 config] ******************************************** 22225 1726882751.26677: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22225 1726882751.26704: getting the remaining hosts for this loop 22225 1726882751.26706: done getting the remaining hosts for this loop 22225 1726882751.26708: getting the next task for host managed_node1 22225 1726882751.26711: done getting next task for host managed_node1 22225 1726882751.26713: ^ task is: TASK: Gathering Facts 22225 1726882751.26714: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882751.26716: getting variables 22225 1726882751.26717: in VariableManager get_vars() 22225 1726882751.26733: Calling all_inventory to load vars for managed_node1 22225 1726882751.26735: Calling groups_inventory to load vars for managed_node1 22225 1726882751.26737: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882751.26741: Calling all_plugins_play to load vars for managed_node1 22225 1726882751.26756: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882751.26760: Calling groups_plugins_play to load vars for managed_node1 22225 1726882751.26933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882751.27169: done with get_vars() 22225 1726882751.27177: done getting variables 22225 1726882751.27220: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Friday 20 September 2024 21:39:11 -0400 (0:00:00.051) 0:00:06.665 ****** 22225 1726882751.27246: entering _queue_task() for managed_node1/gather_facts 22225 1726882751.27499: worker is 1 (out of 1 available) 22225 1726882751.27624: exiting _queue_task() for managed_node1/gather_facts 22225 1726882751.27638: done queuing things up, now waiting for results queue to drain 22225 1726882751.27639: waiting for pending results... 
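For orientation, a minimal sketch of the play being driven here (from tests_ipv6.yml in the fedora.linux_system_roles collection) is shown below; only the play name "Play for testing IPv6 config", the implicit "Gathering Facts" step, and the later "Include the task 'show_interfaces.yml'" task are taken from this log, while the hosts value and the include path are assumptions for illustration:

    - name: Play for testing IPv6 config
      hosts: managed_node1              # assumption: the single managed node seen in this run
      tasks:
        - name: Include the task 'show_interfaces.yml'
          include_tasks: show_interfaces.yml   # actual relative path is an assumption

    # The TASK [Gathering Facts] that follows is not written in the play; it runs
    # implicitly because gather_facts defaults to true for a play. The setup module
    # is packaged as AnsiballZ_setup.py, copied to the remote temp directory created
    # by the mkdir command visible below, executed with the remote python, and its
    # JSON result becomes the host's ansible_facts.
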
22225 1726882751.27791: running TaskExecutor() for managed_node1/TASK: Gathering Facts 22225 1726882751.27927: in run() - task 0affc7ec-ae25-ec05-55b7-000000000115 22225 1726882751.27931: variable 'ansible_search_path' from source: unknown 22225 1726882751.27959: calling self._execute() 22225 1726882751.28053: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.28080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.28126: variable 'omit' from source: magic vars 22225 1726882751.28574: variable 'ansible_distribution_major_version' from source: facts 22225 1726882751.28591: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882751.28601: variable 'omit' from source: magic vars 22225 1726882751.28639: variable 'omit' from source: magic vars 22225 1726882751.28678: variable 'omit' from source: magic vars 22225 1726882751.28726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882751.28831: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882751.28835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882751.28838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882751.28840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882751.28860: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882751.28868: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.28875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.28980: Set connection var ansible_connection to ssh 22225 1726882751.28995: Set connection var ansible_pipelining to False 22225 1726882751.29007: Set connection var ansible_shell_executable to /bin/sh 22225 1726882751.29017: Set connection var ansible_timeout to 10 22225 1726882751.29025: Set connection var ansible_shell_type to sh 22225 1726882751.29035: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882751.29069: variable 'ansible_shell_executable' from source: unknown 22225 1726882751.29076: variable 'ansible_connection' from source: unknown 22225 1726882751.29083: variable 'ansible_module_compression' from source: unknown 22225 1726882751.29090: variable 'ansible_shell_type' from source: unknown 22225 1726882751.29096: variable 'ansible_shell_executable' from source: unknown 22225 1726882751.29102: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882751.29109: variable 'ansible_pipelining' from source: unknown 22225 1726882751.29115: variable 'ansible_timeout' from source: unknown 22225 1726882751.29124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882751.29377: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882751.29381: variable 'omit' from source: magic vars 22225 1726882751.29383: starting attempt loop 22225 1726882751.29386: running the 
handler 22225 1726882751.29388: variable 'ansible_facts' from source: unknown 22225 1726882751.29390: _low_level_execute_command(): starting 22225 1726882751.29399: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882751.30253: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882751.30293: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882751.30320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882751.30407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882751.32157: stdout chunk (state=3): >>>/root <<< 22225 1726882751.32610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882751.32621: stdout chunk (state=3): >>><<< 22225 1726882751.32635: stderr chunk (state=3): >>><<< 22225 1726882751.32664: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882751.32751: _low_level_execute_command(): starting 22225 1726882751.32756: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129 `" && echo ansible-tmp-1726882751.3267033-22481-116316621414129="` echo /root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129 `" ) && sleep 0' 22225 
1726882751.33335: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882751.33347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882751.33362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882751.33380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882751.33410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882751.33425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882751.33524: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882751.33546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882751.33563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882751.33647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882751.35642: stdout chunk (state=3): >>>ansible-tmp-1726882751.3267033-22481-116316621414129=/root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129 <<< 22225 1726882751.35835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882751.35845: stdout chunk (state=3): >>><<< 22225 1726882751.35857: stderr chunk (state=3): >>><<< 22225 1726882751.35878: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882751.3267033-22481-116316621414129=/root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882751.36027: variable 'ansible_module_compression' from source: unknown 22225 1726882751.36031: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22225 1726882751.36049: variable 'ansible_facts' from source: unknown 22225 1726882751.36246: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/AnsiballZ_setup.py 22225 1726882751.36499: Sending initial data 22225 1726882751.36502: Sent initial data (154 bytes) 22225 1726882751.37094: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882751.37107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882751.37133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882751.37249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882751.37273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882751.37362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882751.38979: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22225 1726882751.38998: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882751.39047: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882751.39101: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpem11m2e3 /root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/AnsiballZ_setup.py <<< 22225 1726882751.39103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/AnsiballZ_setup.py" <<< 22225 1726882751.39146: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpem11m2e3" to remote "/root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/AnsiballZ_setup.py" <<< 22225 1726882751.41034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882751.41038: stdout chunk (state=3): >>><<< 22225 1726882751.41041: stderr chunk (state=3): >>><<< 22225 1726882751.41043: done transferring module to remote 22225 1726882751.41045: _low_level_execute_command(): starting 22225 1726882751.41048: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/ /root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/AnsiballZ_setup.py && sleep 0' 22225 1726882751.41885: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882751.41997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882751.42015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882751.42036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882751.42116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882751.43982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882751.44067: stderr chunk (state=3): >>><<< 22225 1726882751.44077: stdout chunk (state=3): >>><<< 22225 1726882751.44102: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882751.44111: _low_level_execute_command(): starting 22225 1726882751.44125: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/AnsiballZ_setup.py && sleep 0' 22225 1726882751.44788: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882751.44793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882751.44796: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882751.44800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882751.44830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882751.44897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882751.44952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882753.26007: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_loadavg": {"1m": 0.69091796875, "5m": 0.615234375, "15m": 0.37255859375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3065, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 651, "free": 3065}, "nocache": {"free": 3478, "used": 238}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansib<<< 22225 1726882753.26065: stdout chunk (state=3): >>>le_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 711, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251373727744, "block_size": 4096, "block_total": 64483404, "block_available": 61370539, "block_used": 3112865, "inode_total": 16384000, "inode_available": 16303049, "inode_used": 80951, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "13", "epoch": "1726882753", "epoch_int": "1726882753", "date": "2024-09-20", "time": "21:39:13", "iso8601_micro": "2024-09-21T01:39:13.216850Z", "iso8601": "2024-09-21T01:39:13Z", "iso8601_basic": "20240920T213913216850", "iso8601_basic_short": "20240920T213913", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, 
"console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off 
[fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": ["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22225 1726882753.28208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882753.28429: stderr chunk (state=3): >>>Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882753.28433: stdout chunk (state=3): >>><<< 22225 1726882753.28436: stderr chunk (state=3): >>><<< 22225 1726882753.28442: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.9-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Sun Sep 8 17:23:55 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-15-7", "ansible_nodename": "ip-10-31-15-7.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec22da150943bd63960f74a27c08f190", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDgEY5v0GEtCYtch7tJojQQAk4PbH2olIXeGHTQIm8P5pFfIFYEIOJVnRYq1iTcsBmLCTgtGM0uASSyO5gXiTJv1YA3W6bzq+KxIdoX/yvSXE7c8N6e/7sMCf9vq2o8xdS3RQTFVoQhj8zkjID057q3vE1D1ocBhYrbVTg76i1ZqUd+ePIBmv/FpJM5vb0YoL7gkfS3SFIJPuVIEqIfwZenELkhPa6MfTG3y/T8+Y4mRwbn7AmsxfBpApbj+TjvOT1vEjw0nBgVhen2pB+dpX/dtpPZiqrQgfcCF+gcf5MY2k2QbuXKKc1iESrpq3sm4as3n9bYr/2i3c3+5PTFN/CKnMpbejvhd+srQgr8UXt57pXIrXxBNe9thnfdeyp9mTxpprGooVC/CHvQ11TbipHGql4+nW9TSIg4s+WIoJGBb5REpn5hh2HmL0/W+Bhet//pxt9ENJxKyunBZToUbqQ2wjkR25JMThiZ6lKuzIRlIAK4i02pPGNUi28QgUDqdR8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKwVnf2m/NbfCFFK6wM50dwDuEJIaiTFh2d8mqI8HYkr65RHvztqJ3ibHa48thfVy5T7cZ8XqhpqkqfQd1OIshs=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIGUhLLUPg0qNUe+aRCNNpHWNDNP2CUtjsgQPqFujAjMC", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:b5954bb9-e972-4b2a-94f1-a82c77e96f77", "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_loadavg": {"1m": 0.69091796875, "5m": 0.615234375, "15m": 
0.37255859375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3065, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 651, "free": 3065}, "nocache": {"free": 3478, "used": 238}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_uuid": "ec22da15-0943-bd63-960f-74a27c08f190", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 711, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264124022784, "size_available": 251373727744, "block_size": 4096, "block_total": 64483404, "block_available": 61370539, "block_used": 3112865, "inode_total": 16384000, "inode_available": 16303049, "inode_used": 80951, "uuid": "6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "39", "second": "13", "epoch": "1726882753", "epoch_int": "1726882753", "date": "2024-09-20", "time": "21:39:13", "iso8601_micro": "2024-09-21T01:39:13.216850Z", "iso8601": "2024-09-21T01:39:13Z", "iso8601_basic": "20240920T213913216850", "iso8601_basic_short": "20240920T213913", "tz": "EDT", 
"tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.13.180 60558 10.31.15.7 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.13.180 60558 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.9-200.fc40.x86_64", "root": "UUID=6bbeacc5-0b4e-4b46-9aaa-4bbfe8b0cfc5", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c5ff:fe8e:44af", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.7", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c5:8e:44:af", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.7"], "ansible_all_ipv6_addresses": 
["fe80::8ff:c5ff:fe8e:44af"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.7", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c5ff:fe8e:44af"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882753.28708: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882753.28740: _low_level_execute_command(): starting 22225 1726882753.28750: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882751.3267033-22481-116316621414129/ > /dev/null 2>&1 && sleep 0' 22225 1726882753.29409: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882753.29428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882753.29449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882753.29546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882753.29569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882753.29663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882753.31635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882753.31646: stdout chunk (state=3): >>><<< 22225 1726882753.31661: stderr chunk (state=3): >>><<< 22225 1726882753.31681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882753.31696: handler run complete 22225 1726882753.31861: variable 'ansible_facts' from source: unknown 22225 1726882753.31984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.32396: variable 'ansible_facts' from source: unknown 22225 1726882753.32506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.32676: attempt loop complete, returning result 22225 1726882753.32686: _execute() done 22225 1726882753.32692: dumping result to json 22225 1726882753.32738: done dumping result, returning 22225 1726882753.32750: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affc7ec-ae25-ec05-55b7-000000000115] 22225 1726882753.32759: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000115 ok: [managed_node1] 22225 1726882753.33827: no more pending results, returning what we have 22225 1726882753.33832: results queue empty 22225 1726882753.33833: checking for any_errors_fatal 22225 1726882753.33834: done checking for any_errors_fatal 22225 1726882753.33835: checking for max_fail_percentage 22225 1726882753.33837: done checking for max_fail_percentage 22225 1726882753.33837: checking to see if all hosts have failed and the running result is not ok 22225 1726882753.33838: done checking to see if all hosts have failed 22225 1726882753.33839: getting the remaining hosts for this loop 22225 1726882753.33840: done getting the remaining hosts for this loop 22225 
1726882753.33844: getting the next task for host managed_node1 22225 1726882753.33849: done getting next task for host managed_node1 22225 1726882753.33850: ^ task is: TASK: meta (flush_handlers) 22225 1726882753.33853: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882753.33856: getting variables 22225 1726882753.33857: in VariableManager get_vars() 22225 1726882753.33886: Calling all_inventory to load vars for managed_node1 22225 1726882753.33889: Calling groups_inventory to load vars for managed_node1 22225 1726882753.33891: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.33898: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000115 22225 1726882753.33901: WORKER PROCESS EXITING 22225 1726882753.33910: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.33913: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.33917: Calling groups_plugins_play to load vars for managed_node1 22225 1726882753.34104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.34350: done with get_vars() 22225 1726882753.34361: done getting variables 22225 1726882753.34443: in VariableManager get_vars() 22225 1726882753.34459: Calling all_inventory to load vars for managed_node1 22225 1726882753.34462: Calling groups_inventory to load vars for managed_node1 22225 1726882753.34464: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.34469: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.34472: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.34480: Calling groups_plugins_play to load vars for managed_node1 22225 1726882753.34651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.34870: done with get_vars() 22225 1726882753.34884: done queuing things up, now waiting for results queue to drain 22225 1726882753.34886: results queue empty 22225 1726882753.34887: checking for any_errors_fatal 22225 1726882753.34890: done checking for any_errors_fatal 22225 1726882753.34891: checking for max_fail_percentage 22225 1726882753.34892: done checking for max_fail_percentage 22225 1726882753.34894: checking to see if all hosts have failed and the running result is not ok 22225 1726882753.34899: done checking to see if all hosts have failed 22225 1726882753.34900: getting the remaining hosts for this loop 22225 1726882753.34901: done getting the remaining hosts for this loop 22225 1726882753.34904: getting the next task for host managed_node1 22225 1726882753.34908: done getting next task for host managed_node1 22225 1726882753.34916: ^ task is: TASK: Include the task 'show_interfaces.yml' 22225 1726882753.34918: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882753.34920: getting variables 22225 1726882753.34921: in VariableManager get_vars() 22225 1726882753.34938: Calling all_inventory to load vars for managed_node1 22225 1726882753.34940: Calling groups_inventory to load vars for managed_node1 22225 1726882753.34942: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.34947: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.34950: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.34952: Calling groups_plugins_play to load vars for managed_node1 22225 1726882753.35125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.35364: done with get_vars() 22225 1726882753.35372: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:9 Friday 20 September 2024 21:39:13 -0400 (0:00:02.082) 0:00:08.747 ****** 22225 1726882753.35452: entering _queue_task() for managed_node1/include_tasks 22225 1726882753.35772: worker is 1 (out of 1 available) 22225 1726882753.35899: exiting _queue_task() for managed_node1/include_tasks 22225 1726882753.35912: done queuing things up, now waiting for results queue to drain 22225 1726882753.35913: waiting for pending results... 22225 1726882753.36092: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 22225 1726882753.36204: in run() - task 0affc7ec-ae25-ec05-55b7-00000000000b 22225 1726882753.36233: variable 'ansible_search_path' from source: unknown 22225 1726882753.36279: calling self._execute() 22225 1726882753.36382: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.36396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.36411: variable 'omit' from source: magic vars 22225 1726882753.36814: variable 'ansible_distribution_major_version' from source: facts 22225 1726882753.36833: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882753.36843: _execute() done 22225 1726882753.36850: dumping result to json 22225 1726882753.36856: done dumping result, returning 22225 1726882753.36864: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0affc7ec-ae25-ec05-55b7-00000000000b] 22225 1726882753.36878: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000000b 22225 1726882753.37232: no more pending results, returning what we have 22225 1726882753.37238: in VariableManager get_vars() 22225 1726882753.37275: Calling all_inventory to load vars for managed_node1 22225 1726882753.37277: Calling groups_inventory to load vars for managed_node1 22225 1726882753.37279: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.37289: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.37292: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.37296: Calling groups_plugins_play to load vars for managed_node1 22225 1726882753.37520: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000000b 22225 1726882753.37528: WORKER PROCESS EXITING 22225 1726882753.37552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.37792: done with get_vars() 22225 1726882753.37800: variable 
'ansible_search_path' from source: unknown 22225 1726882753.37812: we have included files to process 22225 1726882753.37814: generating all_blocks data 22225 1726882753.37815: done generating all_blocks data 22225 1726882753.37816: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22225 1726882753.37817: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22225 1726882753.37819: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22225 1726882753.37981: in VariableManager get_vars() 22225 1726882753.38004: done with get_vars() 22225 1726882753.38119: done processing included file 22225 1726882753.38121: iterating over new_blocks loaded from include file 22225 1726882753.38124: in VariableManager get_vars() 22225 1726882753.38141: done with get_vars() 22225 1726882753.38143: filtering new block on tags 22225 1726882753.38159: done filtering new block on tags 22225 1726882753.38162: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 22225 1726882753.38166: extending task lists for all hosts with included blocks 22225 1726882753.38242: done extending task lists 22225 1726882753.38243: done processing included files 22225 1726882753.38244: results queue empty 22225 1726882753.38245: checking for any_errors_fatal 22225 1726882753.38246: done checking for any_errors_fatal 22225 1726882753.38247: checking for max_fail_percentage 22225 1726882753.38248: done checking for max_fail_percentage 22225 1726882753.38249: checking to see if all hosts have failed and the running result is not ok 22225 1726882753.38250: done checking to see if all hosts have failed 22225 1726882753.38250: getting the remaining hosts for this loop 22225 1726882753.38252: done getting the remaining hosts for this loop 22225 1726882753.38254: getting the next task for host managed_node1 22225 1726882753.38257: done getting next task for host managed_node1 22225 1726882753.38259: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22225 1726882753.38262: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882753.38264: getting variables 22225 1726882753.38265: in VariableManager get_vars() 22225 1726882753.38278: Calling all_inventory to load vars for managed_node1 22225 1726882753.38280: Calling groups_inventory to load vars for managed_node1 22225 1726882753.38282: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.38292: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.38294: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.38297: Calling groups_plugins_play to load vars for managed_node1 22225 1726882753.38467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.38736: done with get_vars() 22225 1726882753.38745: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:13 -0400 (0:00:00.033) 0:00:08.780 ****** 22225 1726882753.38812: entering _queue_task() for managed_node1/include_tasks 22225 1726882753.39054: worker is 1 (out of 1 available) 22225 1726882753.39067: exiting _queue_task() for managed_node1/include_tasks 22225 1726882753.39078: done queuing things up, now waiting for results queue to drain 22225 1726882753.39080: waiting for pending results... 22225 1726882753.39325: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 22225 1726882753.39450: in run() - task 0affc7ec-ae25-ec05-55b7-00000000012b 22225 1726882753.39454: variable 'ansible_search_path' from source: unknown 22225 1726882753.39527: variable 'ansible_search_path' from source: unknown 22225 1726882753.39530: calling self._execute() 22225 1726882753.39586: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.39597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.39610: variable 'omit' from source: magic vars 22225 1726882753.39990: variable 'ansible_distribution_major_version' from source: facts 22225 1726882753.40011: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882753.40021: _execute() done 22225 1726882753.40031: dumping result to json 22225 1726882753.40040: done dumping result, returning 22225 1726882753.40050: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0affc7ec-ae25-ec05-55b7-00000000012b] 22225 1726882753.40059: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000012b 22225 1726882753.40292: no more pending results, returning what we have 22225 1726882753.40297: in VariableManager get_vars() 22225 1726882753.40342: Calling all_inventory to load vars for managed_node1 22225 1726882753.40345: Calling groups_inventory to load vars for managed_node1 22225 1726882753.40347: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.40359: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.40362: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.40365: Calling groups_plugins_play to load vars for managed_node1 22225 1726882753.40627: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000012b 22225 1726882753.40631: WORKER PROCESS EXITING 22225 1726882753.40658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 22225 1726882753.40900: done with get_vars() 22225 1726882753.40908: variable 'ansible_search_path' from source: unknown 22225 1726882753.40909: variable 'ansible_search_path' from source: unknown 22225 1726882753.40949: we have included files to process 22225 1726882753.40950: generating all_blocks data 22225 1726882753.40951: done generating all_blocks data 22225 1726882753.40953: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22225 1726882753.40954: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22225 1726882753.40956: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22225 1726882753.41261: done processing included file 22225 1726882753.41263: iterating over new_blocks loaded from include file 22225 1726882753.41264: in VariableManager get_vars() 22225 1726882753.41284: done with get_vars() 22225 1726882753.41286: filtering new block on tags 22225 1726882753.41310: done filtering new block on tags 22225 1726882753.41313: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 22225 1726882753.41317: extending task lists for all hosts with included blocks 22225 1726882753.41438: done extending task lists 22225 1726882753.41439: done processing included files 22225 1726882753.41440: results queue empty 22225 1726882753.41441: checking for any_errors_fatal 22225 1726882753.41444: done checking for any_errors_fatal 22225 1726882753.41445: checking for max_fail_percentage 22225 1726882753.41446: done checking for max_fail_percentage 22225 1726882753.41447: checking to see if all hosts have failed and the running result is not ok 22225 1726882753.41448: done checking to see if all hosts have failed 22225 1726882753.41448: getting the remaining hosts for this loop 22225 1726882753.41450: done getting the remaining hosts for this loop 22225 1726882753.41453: getting the next task for host managed_node1 22225 1726882753.41457: done getting next task for host managed_node1 22225 1726882753.41459: ^ task is: TASK: Gather current interface info 22225 1726882753.41462: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882753.41465: getting variables 22225 1726882753.41466: in VariableManager get_vars() 22225 1726882753.41479: Calling all_inventory to load vars for managed_node1 22225 1726882753.41481: Calling groups_inventory to load vars for managed_node1 22225 1726882753.41483: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.41488: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.41491: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.41494: Calling groups_plugins_play to load vars for managed_node1 22225 1726882753.41693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.41932: done with get_vars() 22225 1726882753.41941: done getting variables 22225 1726882753.41987: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:13 -0400 (0:00:00.032) 0:00:08.812 ****** 22225 1726882753.42016: entering _queue_task() for managed_node1/command 22225 1726882753.42321: worker is 1 (out of 1 available) 22225 1726882753.42335: exiting _queue_task() for managed_node1/command 22225 1726882753.42346: done queuing things up, now waiting for results queue to drain 22225 1726882753.42348: waiting for pending results... 
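
For orientation, the include at tests_ipv6.yml:9 that the log has just finished processing probably corresponds to a task along the following lines. This is a hedged reconstruction from the task name, the task path, and the included-file path shown in the log; the actual playbook may differ, and the when: guard is only inferred from the repeated "Evaluated conditional (ansible_distribution_major_version != '6'): True" entries (it may be applied at play level rather than on this task).

    # Hypothetical sketch of tests_ipv6.yml line 9, inferred from this log.
    - name: Include the task 'show_interfaces.yml'
      include_tasks: tasks/show_interfaces.yml
      when: ansible_distribution_major_version != '6'   # inferred, see note above
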
22225 1726882753.42561: running TaskExecutor() for managed_node1/TASK: Gather current interface info 22225 1726882753.42687: in run() - task 0affc7ec-ae25-ec05-55b7-00000000013a 22225 1726882753.42707: variable 'ansible_search_path' from source: unknown 22225 1726882753.42723: variable 'ansible_search_path' from source: unknown 22225 1726882753.42830: calling self._execute() 22225 1726882753.42853: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.42865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.42878: variable 'omit' from source: magic vars 22225 1726882753.43266: variable 'ansible_distribution_major_version' from source: facts 22225 1726882753.43286: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882753.43298: variable 'omit' from source: magic vars 22225 1726882753.43351: variable 'omit' from source: magic vars 22225 1726882753.43398: variable 'omit' from source: magic vars 22225 1726882753.43448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882753.43496: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882753.43524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882753.43593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882753.43596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882753.43602: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882753.43611: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.43619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.43732: Set connection var ansible_connection to ssh 22225 1726882753.43749: Set connection var ansible_pipelining to False 22225 1726882753.43763: Set connection var ansible_shell_executable to /bin/sh 22225 1726882753.43773: Set connection var ansible_timeout to 10 22225 1726882753.43810: Set connection var ansible_shell_type to sh 22225 1726882753.43813: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882753.43828: variable 'ansible_shell_executable' from source: unknown 22225 1726882753.43839: variable 'ansible_connection' from source: unknown 22225 1726882753.43848: variable 'ansible_module_compression' from source: unknown 22225 1726882753.43855: variable 'ansible_shell_type' from source: unknown 22225 1726882753.43919: variable 'ansible_shell_executable' from source: unknown 22225 1726882753.43922: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.43928: variable 'ansible_pipelining' from source: unknown 22225 1726882753.43930: variable 'ansible_timeout' from source: unknown 22225 1726882753.43932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.44048: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882753.44067: variable 'omit' from source: magic vars 22225 
1726882753.44079: starting attempt loop 22225 1726882753.44086: running the handler 22225 1726882753.44105: _low_level_execute_command(): starting 22225 1726882753.44115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882753.44845: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882753.44859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882753.44871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882753.44888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882753.44908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882753.45014: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882753.45042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882753.45138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882753.46916: stdout chunk (state=3): >>>/root <<< 22225 1726882753.47344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882753.47347: stdout chunk (state=3): >>><<< 22225 1726882753.47350: stderr chunk (state=3): >>><<< 22225 1726882753.47353: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882753.47356: _low_level_execute_command(): starting 22225 1726882753.47359: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487 `" && echo ansible-tmp-1726882753.472472-22533-69801173746487="` echo /root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487 `" ) && sleep 0' 22225 1726882753.48480: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882753.48503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882753.48809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882753.48955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882753.50958: stdout chunk (state=3): >>>ansible-tmp-1726882753.472472-22533-69801173746487=/root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487 <<< 22225 1726882753.51330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882753.51333: stdout chunk (state=3): >>><<< 22225 1726882753.51404: stderr chunk (state=3): >>><<< 22225 1726882753.51408: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882753.472472-22533-69801173746487=/root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882753.51411: variable 'ansible_module_compression' from source: unknown 22225 1726882753.51656: ANSIBALLZ: Using generic lock for ansible.legacy.command 22225 1726882753.51660: ANSIBALLZ: Acquiring lock 22225 1726882753.51663: 
ANSIBALLZ: Lock acquired: 140272895053888 22225 1726882753.51668: ANSIBALLZ: Creating module 22225 1726882753.66164: ANSIBALLZ: Writing module into payload 22225 1726882753.66366: ANSIBALLZ: Writing module 22225 1726882753.66370: ANSIBALLZ: Renaming module 22225 1726882753.66373: ANSIBALLZ: Done creating module 22225 1726882753.66441: variable 'ansible_facts' from source: unknown 22225 1726882753.66507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/AnsiballZ_command.py 22225 1726882753.66618: Sending initial data 22225 1726882753.66624: Sent initial data (154 bytes) 22225 1726882753.67105: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882753.67109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882753.67111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882753.67114: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882753.67117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882753.67119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882753.67170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882753.67178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882753.67180: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882753.67236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882753.69160: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882753.69237: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882753.69377: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpnq7phddc /root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/AnsiballZ_command.py <<< 22225 1726882753.69381: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/AnsiballZ_command.py" <<< 22225 1726882753.69424: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpnq7phddc" to remote "/root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/AnsiballZ_command.py" <<< 22225 1726882753.70249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882753.70326: stderr chunk (state=3): >>><<< 22225 1726882753.70336: stdout chunk (state=3): >>><<< 22225 1726882753.70353: done transferring module to remote 22225 1726882753.70364: _low_level_execute_command(): starting 22225 1726882753.70371: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/ /root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/AnsiballZ_command.py && sleep 0' 22225 1726882753.70786: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882753.70790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882753.70825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882753.70828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882753.70835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882753.70837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882753.70879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882753.70900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882753.70952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882753.73031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882753.73036: stdout chunk (state=3): >>><<< 22225 1726882753.73039: stderr chunk (state=3): >>><<< 22225 1726882753.73041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882753.73044: _low_level_execute_command(): starting 22225 1726882753.73046: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/AnsiballZ_command.py && sleep 0' 22225 1726882753.73710: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882753.73739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882753.73838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882753.73850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882753.73865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882753.73879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882753.73973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882753.91086: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:13.905477", "end": "2024-09-20 21:39:13.909101", "delta": "0:00:00.003624", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882753.92765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882753.92819: stderr chunk (state=3): >>><<< 22225 1726882753.92825: stdout chunk (state=3): >>><<< 22225 1726882753.92843: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:13.905477", "end": "2024-09-20 21:39:13.909101", "delta": "0:00:00.003624", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
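
The module invocation just echoed back above shows the arguments the 'Gather current interface info' task passed to ansible.legacy.command (chdir: /sys/class/net, raw params "ls -1"). A hedged sketch of how that task likely appears at get_current_interfaces.yml:3 follows; the register name _current_interfaces is an assumption taken from the variable reference that appears later in the log.

    # Hypothetical reconstruction of get_current_interfaces.yml:3, based on the
    # module_args and the '_current_interfaces' variable seen in this log.
    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces   # assumed register name
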
22225 1726882753.92874: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882753.92884: _low_level_execute_command(): starting 22225 1726882753.92887: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882753.472472-22533-69801173746487/ > /dev/null 2>&1 && sleep 0' 22225 1726882753.93305: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882753.93339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882753.93343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882753.93353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882753.93400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882753.93408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882753.93459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882753.95424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882753.95467: stderr chunk (state=3): >>><<< 22225 1726882753.95470: stdout chunk (state=3): >>><<< 22225 1726882753.95484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882753.95491: handler run complete 22225 1726882753.95511: Evaluated conditional (False): False 22225 1726882753.95520: attempt loop complete, returning result 22225 1726882753.95524: _execute() done 22225 1726882753.95527: dumping result to json 22225 1726882753.95532: done dumping result, returning 22225 1726882753.95540: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0affc7ec-ae25-ec05-55b7-00000000013a] 22225 1726882753.95545: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000013a 22225 1726882753.95651: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000013a 22225 1726882753.95653: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003624", "end": "2024-09-20 21:39:13.909101", "rc": 0, "start": "2024-09-20 21:39:13.905477" } STDOUT: bonding_masters eth0 lo 22225 1726882753.95732: no more pending results, returning what we have 22225 1726882753.95735: results queue empty 22225 1726882753.95736: checking for any_errors_fatal 22225 1726882753.95738: done checking for any_errors_fatal 22225 1726882753.95738: checking for max_fail_percentage 22225 1726882753.95740: done checking for max_fail_percentage 22225 1726882753.95741: checking to see if all hosts have failed and the running result is not ok 22225 1726882753.95742: done checking to see if all hosts have failed 22225 1726882753.95742: getting the remaining hosts for this loop 22225 1726882753.95744: done getting the remaining hosts for this loop 22225 1726882753.95748: getting the next task for host managed_node1 22225 1726882753.95755: done getting next task for host managed_node1 22225 1726882753.95757: ^ task is: TASK: Set current_interfaces 22225 1726882753.95761: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882753.95764: getting variables 22225 1726882753.95766: in VariableManager get_vars() 22225 1726882753.95806: Calling all_inventory to load vars for managed_node1 22225 1726882753.95809: Calling groups_inventory to load vars for managed_node1 22225 1726882753.95811: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.95829: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.95833: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.95836: Calling groups_plugins_play to load vars for managed_node1 22225 1726882753.95999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.96169: done with get_vars() 22225 1726882753.96177: done getting variables 22225 1726882753.96227: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:13 -0400 (0:00:00.542) 0:00:09.355 ****** 22225 1726882753.96248: entering _queue_task() for managed_node1/set_fact 22225 1726882753.96461: worker is 1 (out of 1 available) 22225 1726882753.96473: exiting _queue_task() for managed_node1/set_fact 22225 1726882753.96484: done queuing things up, now waiting for results queue to drain 22225 1726882753.96486: waiting for pending results... 
22225 1726882753.96643: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 22225 1726882753.96719: in run() - task 0affc7ec-ae25-ec05-55b7-00000000013b 22225 1726882753.96728: variable 'ansible_search_path' from source: unknown 22225 1726882753.96734: variable 'ansible_search_path' from source: unknown 22225 1726882753.96761: calling self._execute() 22225 1726882753.96831: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.96835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.96845: variable 'omit' from source: magic vars 22225 1726882753.97126: variable 'ansible_distribution_major_version' from source: facts 22225 1726882753.97137: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882753.97144: variable 'omit' from source: magic vars 22225 1726882753.97180: variable 'omit' from source: magic vars 22225 1726882753.97262: variable '_current_interfaces' from source: set_fact 22225 1726882753.97308: variable 'omit' from source: magic vars 22225 1726882753.97340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882753.97372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882753.97388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882753.97402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882753.97412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882753.97437: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882753.97441: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.97443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.97518: Set connection var ansible_connection to ssh 22225 1726882753.97528: Set connection var ansible_pipelining to False 22225 1726882753.97535: Set connection var ansible_shell_executable to /bin/sh 22225 1726882753.97541: Set connection var ansible_timeout to 10 22225 1726882753.97544: Set connection var ansible_shell_type to sh 22225 1726882753.97549: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882753.97568: variable 'ansible_shell_executable' from source: unknown 22225 1726882753.97571: variable 'ansible_connection' from source: unknown 22225 1726882753.97574: variable 'ansible_module_compression' from source: unknown 22225 1726882753.97576: variable 'ansible_shell_type' from source: unknown 22225 1726882753.97578: variable 'ansible_shell_executable' from source: unknown 22225 1726882753.97585: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.97593: variable 'ansible_pipelining' from source: unknown 22225 1726882753.97596: variable 'ansible_timeout' from source: unknown 22225 1726882753.97602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.97713: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 22225 1726882753.97721: variable 'omit' from source: magic vars 22225 1726882753.97731: starting attempt loop 22225 1726882753.97734: running the handler 22225 1726882753.97743: handler run complete 22225 1726882753.97751: attempt loop complete, returning result 22225 1726882753.97754: _execute() done 22225 1726882753.97757: dumping result to json 22225 1726882753.97759: done dumping result, returning 22225 1726882753.97766: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0affc7ec-ae25-ec05-55b7-00000000013b] 22225 1726882753.97771: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000013b 22225 1726882753.97855: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000013b 22225 1726882753.97859: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 22225 1726882753.97923: no more pending results, returning what we have 22225 1726882753.97926: results queue empty 22225 1726882753.97927: checking for any_errors_fatal 22225 1726882753.97933: done checking for any_errors_fatal 22225 1726882753.97934: checking for max_fail_percentage 22225 1726882753.97936: done checking for max_fail_percentage 22225 1726882753.97936: checking to see if all hosts have failed and the running result is not ok 22225 1726882753.97937: done checking to see if all hosts have failed 22225 1726882753.97938: getting the remaining hosts for this loop 22225 1726882753.97939: done getting the remaining hosts for this loop 22225 1726882753.97942: getting the next task for host managed_node1 22225 1726882753.97949: done getting next task for host managed_node1 22225 1726882753.97951: ^ task is: TASK: Show current_interfaces 22225 1726882753.97954: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882753.97957: getting variables 22225 1726882753.97958: in VariableManager get_vars() 22225 1726882753.97995: Calling all_inventory to load vars for managed_node1 22225 1726882753.97998: Calling groups_inventory to load vars for managed_node1 22225 1726882753.98000: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.98007: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.98009: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.98011: Calling groups_plugins_play to load vars for managed_node1 22225 1726882753.98132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882753.98268: done with get_vars() 22225 1726882753.98275: done getting variables 22225 1726882753.98346: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:13 -0400 (0:00:00.021) 0:00:09.376 ****** 22225 1726882753.98365: entering _queue_task() for managed_node1/debug 22225 1726882753.98367: Creating lock for debug 22225 1726882753.98557: worker is 1 (out of 1 available) 22225 1726882753.98571: exiting _queue_task() for managed_node1/debug 22225 1726882753.98584: done queuing things up, now waiting for results queue to drain 22225 1726882753.98585: waiting for pending results... 
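
The 'Set current_interfaces' result above (current_interfaces: ['bonding_masters', 'eth0', 'lo']) matches the stdout_lines of the registered ls command, so the task at get_current_interfaces.yml:9 is probably a set_fact of roughly this shape; the exact Jinja2 expression is an assumption.

    # Hypothetical sketch of get_current_interfaces.yml:9; the stdout_lines
    # expression is assumed from the registered command output shown above.
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"
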
22225 1726882753.98728: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 22225 1726882753.98790: in run() - task 0affc7ec-ae25-ec05-55b7-00000000012c 22225 1726882753.98801: variable 'ansible_search_path' from source: unknown 22225 1726882753.98804: variable 'ansible_search_path' from source: unknown 22225 1726882753.98837: calling self._execute() 22225 1726882753.98897: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.98902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.98910: variable 'omit' from source: magic vars 22225 1726882753.99170: variable 'ansible_distribution_major_version' from source: facts 22225 1726882753.99182: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882753.99186: variable 'omit' from source: magic vars 22225 1726882753.99211: variable 'omit' from source: magic vars 22225 1726882753.99285: variable 'current_interfaces' from source: set_fact 22225 1726882753.99302: variable 'omit' from source: magic vars 22225 1726882753.99334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882753.99363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882753.99378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882753.99392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882753.99401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882753.99425: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882753.99429: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.99434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.99506: Set connection var ansible_connection to ssh 22225 1726882753.99515: Set connection var ansible_pipelining to False 22225 1726882753.99524: Set connection var ansible_shell_executable to /bin/sh 22225 1726882753.99529: Set connection var ansible_timeout to 10 22225 1726882753.99532: Set connection var ansible_shell_type to sh 22225 1726882753.99538: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882753.99556: variable 'ansible_shell_executable' from source: unknown 22225 1726882753.99559: variable 'ansible_connection' from source: unknown 22225 1726882753.99562: variable 'ansible_module_compression' from source: unknown 22225 1726882753.99564: variable 'ansible_shell_type' from source: unknown 22225 1726882753.99566: variable 'ansible_shell_executable' from source: unknown 22225 1726882753.99570: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882753.99573: variable 'ansible_pipelining' from source: unknown 22225 1726882753.99575: variable 'ansible_timeout' from source: unknown 22225 1726882753.99589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882753.99686: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
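The "Set connection var" lines above record the per-task connection settings the executor resolved for managed_node1. As a rough illustration only (most of these are ansible-core defaults rather than values set by this test run), the same settings could be expressed as hypothetical inventory host variables:

    # Hypothetical host_vars sketch mirroring the connection settings logged above;
    # the real values come from defaults plus the generated inventory, not this file.
    managed_node1:
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_timeout: 10
      ansible_pipelining: false
      ansible_module_compression: ZIP_DEFLATED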
22225 1726882753.99690: variable 'omit' from source: magic vars 22225 1726882753.99703: starting attempt loop 22225 1726882753.99706: running the handler 22225 1726882753.99738: handler run complete 22225 1726882753.99749: attempt loop complete, returning result 22225 1726882753.99752: _execute() done 22225 1726882753.99755: dumping result to json 22225 1726882753.99757: done dumping result, returning 22225 1726882753.99764: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0affc7ec-ae25-ec05-55b7-00000000012c] 22225 1726882753.99768: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000012c 22225 1726882753.99857: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000012c 22225 1726882753.99859: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 22225 1726882753.99906: no more pending results, returning what we have 22225 1726882753.99909: results queue empty 22225 1726882753.99910: checking for any_errors_fatal 22225 1726882753.99913: done checking for any_errors_fatal 22225 1726882753.99914: checking for max_fail_percentage 22225 1726882753.99916: done checking for max_fail_percentage 22225 1726882753.99916: checking to see if all hosts have failed and the running result is not ok 22225 1726882753.99917: done checking to see if all hosts have failed 22225 1726882753.99918: getting the remaining hosts for this loop 22225 1726882753.99919: done getting the remaining hosts for this loop 22225 1726882753.99924: getting the next task for host managed_node1 22225 1726882753.99930: done getting next task for host managed_node1 22225 1726882753.99933: ^ task is: TASK: Include the task 'manage_test_interface.yml' 22225 1726882753.99935: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882753.99938: getting variables 22225 1726882753.99939: in VariableManager get_vars() 22225 1726882753.99970: Calling all_inventory to load vars for managed_node1 22225 1726882753.99977: Calling groups_inventory to load vars for managed_node1 22225 1726882753.99978: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882753.99988: Calling all_plugins_play to load vars for managed_node1 22225 1726882753.99990: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882753.99992: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.00145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.00280: done with get_vars() 22225 1726882754.00288: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:11 Friday 20 September 2024 21:39:14 -0400 (0:00:00.019) 0:00:09.396 ****** 22225 1726882754.00351: entering _queue_task() for managed_node1/include_tasks 22225 1726882754.00518: worker is 1 (out of 1 available) 22225 1726882754.00533: exiting _queue_task() for managed_node1/include_tasks 22225 1726882754.00544: done queuing things up, now waiting for results queue to drain 22225 1726882754.00545: waiting for pending results... 
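The debug output just printed (MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']) comes from the task at show_interfaces.yml:5, which the executor only ran after the ansible_distribution_major_version != '6' conditional evaluated True. A minimal sketch of that task, assuming it simply echoes the fact set by the earlier 'Set current_interfaces' task (the exact msg wording in the real test file may differ):

    # Sketch of the 'Show current_interfaces' task (show_interfaces.yml:5),
    # reconstructed from the logged result.
    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"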
22225 1726882754.00703: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 22225 1726882754.00766: in run() - task 0affc7ec-ae25-ec05-55b7-00000000000c 22225 1726882754.00784: variable 'ansible_search_path' from source: unknown 22225 1726882754.00812: calling self._execute() 22225 1726882754.00880: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.00890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.00899: variable 'omit' from source: magic vars 22225 1726882754.01174: variable 'ansible_distribution_major_version' from source: facts 22225 1726882754.01186: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882754.01190: _execute() done 22225 1726882754.01195: dumping result to json 22225 1726882754.01198: done dumping result, returning 22225 1726882754.01205: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0affc7ec-ae25-ec05-55b7-00000000000c] 22225 1726882754.01215: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000000c 22225 1726882754.01303: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000000c 22225 1726882754.01306: WORKER PROCESS EXITING 22225 1726882754.01342: no more pending results, returning what we have 22225 1726882754.01347: in VariableManager get_vars() 22225 1726882754.01382: Calling all_inventory to load vars for managed_node1 22225 1726882754.01385: Calling groups_inventory to load vars for managed_node1 22225 1726882754.01387: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.01396: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.01399: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.01402: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.01530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.01662: done with get_vars() 22225 1726882754.01668: variable 'ansible_search_path' from source: unknown 22225 1726882754.01676: we have included files to process 22225 1726882754.01677: generating all_blocks data 22225 1726882754.01678: done generating all_blocks data 22225 1726882754.01683: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22225 1726882754.01684: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22225 1726882754.01686: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22225 1726882754.02044: in VariableManager get_vars() 22225 1726882754.02058: done with get_vars() 22225 1726882754.02210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 22225 1726882754.02604: done processing included file 22225 1726882754.02605: iterating over new_blocks loaded from include file 22225 1726882754.02606: in VariableManager get_vars() 22225 1726882754.02619: done with get_vars() 22225 1726882754.02621: filtering new block on tags 22225 1726882754.02645: done filtering new block on tags 22225 1726882754.02646: done iterating over new_blocks loaded from include file included: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 22225 1726882754.02790: extending task lists for all hosts with included blocks 22225 1726882754.02898: done extending task lists 22225 1726882754.02899: done processing included files 22225 1726882754.02900: results queue empty 22225 1726882754.02900: checking for any_errors_fatal 22225 1726882754.02902: done checking for any_errors_fatal 22225 1726882754.02902: checking for max_fail_percentage 22225 1726882754.02903: done checking for max_fail_percentage 22225 1726882754.02903: checking to see if all hosts have failed and the running result is not ok 22225 1726882754.02904: done checking to see if all hosts have failed 22225 1726882754.02905: getting the remaining hosts for this loop 22225 1726882754.02905: done getting the remaining hosts for this loop 22225 1726882754.02907: getting the next task for host managed_node1 22225 1726882754.02910: done getting next task for host managed_node1 22225 1726882754.02911: ^ task is: TASK: Ensure state in ["present", "absent"] 22225 1726882754.02913: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882754.02914: getting variables 22225 1726882754.02915: in VariableManager get_vars() 22225 1726882754.02925: Calling all_inventory to load vars for managed_node1 22225 1726882754.02926: Calling groups_inventory to load vars for managed_node1 22225 1726882754.02928: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.02932: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.02933: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.02935: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.03030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.03164: done with get_vars() 22225 1726882754.03171: done getting variables 22225 1726882754.03216: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:39:14 -0400 (0:00:00.028) 0:00:09.425 ****** 22225 1726882754.03238: entering _queue_task() for managed_node1/fail 22225 1726882754.03239: Creating lock for fail 22225 1726882754.03448: worker is 1 (out of 1 available) 22225 1726882754.03461: exiting _queue_task() for managed_node1/fail 22225 1726882754.03472: done queuing things up, now waiting for results queue to drain 22225 1726882754.03474: waiting for pending results... 
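The include that was just expanded (tests_ipv6.yml:11) pulls manage_test_interface.yml into the task list before the validation task queued above runs. A plausible sketch of that include, assuming the relative path in the real playbook matches the file the loader reports; the actual test likely also passes vars (interface name, state, type) that this sketch omits:

    # Hypothetical form of the include at tests_ipv6.yml:11.
    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml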
22225 1726882754.03634: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 22225 1726882754.03698: in run() - task 0affc7ec-ae25-ec05-55b7-000000000156 22225 1726882754.03712: variable 'ansible_search_path' from source: unknown 22225 1726882754.03717: variable 'ansible_search_path' from source: unknown 22225 1726882754.03745: calling self._execute() 22225 1726882754.03815: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.03819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.03831: variable 'omit' from source: magic vars 22225 1726882754.04118: variable 'ansible_distribution_major_version' from source: facts 22225 1726882754.04129: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882754.04227: variable 'state' from source: include params 22225 1726882754.04232: Evaluated conditional (state not in ["present", "absent"]): False 22225 1726882754.04235: when evaluation is False, skipping this task 22225 1726882754.04238: _execute() done 22225 1726882754.04240: dumping result to json 22225 1726882754.04245: done dumping result, returning 22225 1726882754.04251: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0affc7ec-ae25-ec05-55b7-000000000156] 22225 1726882754.04262: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000156 22225 1726882754.04350: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000156 22225 1726882754.04353: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 22225 1726882754.04409: no more pending results, returning what we have 22225 1726882754.04413: results queue empty 22225 1726882754.04414: checking for any_errors_fatal 22225 1726882754.04415: done checking for any_errors_fatal 22225 1726882754.04416: checking for max_fail_percentage 22225 1726882754.04418: done checking for max_fail_percentage 22225 1726882754.04419: checking to see if all hosts have failed and the running result is not ok 22225 1726882754.04419: done checking to see if all hosts have failed 22225 1726882754.04420: getting the remaining hosts for this loop 22225 1726882754.04423: done getting the remaining hosts for this loop 22225 1726882754.04427: getting the next task for host managed_node1 22225 1726882754.04431: done getting next task for host managed_node1 22225 1726882754.04434: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 22225 1726882754.04437: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882754.04440: getting variables 22225 1726882754.04441: in VariableManager get_vars() 22225 1726882754.04475: Calling all_inventory to load vars for managed_node1 22225 1726882754.04478: Calling groups_inventory to load vars for managed_node1 22225 1726882754.04480: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.04489: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.04491: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.04494: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.04640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.04779: done with get_vars() 22225 1726882754.04786: done getting variables 22225 1726882754.04827: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:39:14 -0400 (0:00:00.016) 0:00:09.441 ****** 22225 1726882754.04848: entering _queue_task() for managed_node1/fail 22225 1726882754.05032: worker is 1 (out of 1 available) 22225 1726882754.05047: exiting _queue_task() for managed_node1/fail 22225 1726882754.05056: done queuing things up, now waiting for results queue to drain 22225 1726882754.05058: waiting for pending results... 22225 1726882754.05207: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 22225 1726882754.05268: in run() - task 0affc7ec-ae25-ec05-55b7-000000000157 22225 1726882754.05280: variable 'ansible_search_path' from source: unknown 22225 1726882754.05284: variable 'ansible_search_path' from source: unknown 22225 1726882754.05314: calling self._execute() 22225 1726882754.05378: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.05387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.05397: variable 'omit' from source: magic vars 22225 1726882754.05668: variable 'ansible_distribution_major_version' from source: facts 22225 1726882754.05678: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882754.05781: variable 'type' from source: play vars 22225 1726882754.05789: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 22225 1726882754.05792: when evaluation is False, skipping this task 22225 1726882754.05795: _execute() done 22225 1726882754.05797: dumping result to json 22225 1726882754.05799: done dumping result, returning 22225 1726882754.05806: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affc7ec-ae25-ec05-55b7-000000000157] 22225 1726882754.05811: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000157 22225 1726882754.05898: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000157 22225 1726882754.05902: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 22225 1726882754.05949: no more pending 
results, returning what we have 22225 1726882754.05952: results queue empty 22225 1726882754.05953: checking for any_errors_fatal 22225 1726882754.05958: done checking for any_errors_fatal 22225 1726882754.05959: checking for max_fail_percentage 22225 1726882754.05960: done checking for max_fail_percentage 22225 1726882754.05961: checking to see if all hosts have failed and the running result is not ok 22225 1726882754.05962: done checking to see if all hosts have failed 22225 1726882754.05963: getting the remaining hosts for this loop 22225 1726882754.05964: done getting the remaining hosts for this loop 22225 1726882754.05967: getting the next task for host managed_node1 22225 1726882754.05973: done getting next task for host managed_node1 22225 1726882754.05975: ^ task is: TASK: Include the task 'show_interfaces.yml' 22225 1726882754.05978: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882754.05980: getting variables 22225 1726882754.05982: in VariableManager get_vars() 22225 1726882754.06013: Calling all_inventory to load vars for managed_node1 22225 1726882754.06020: Calling groups_inventory to load vars for managed_node1 22225 1726882754.06024: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.06031: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.06034: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.06036: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.06156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.06292: done with get_vars() 22225 1726882754.06299: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:39:14 -0400 (0:00:00.015) 0:00:09.456 ****** 22225 1726882754.06363: entering _queue_task() for managed_node1/include_tasks 22225 1726882754.06537: worker is 1 (out of 1 available) 22225 1726882754.06552: exiting _queue_task() for managed_node1/include_tasks 22225 1726882754.06562: done queuing things up, now waiting for results queue to drain 22225 1726882754.06564: waiting for pending results... 
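Both guard tasks at manage_test_interface.yml:3 and :8 were skipped because their fail conditions evaluated to False ('state not in ["present", "absent"]' and 'type not in ["dummy", "tap", "veth"]'). A minimal sketch of this fail-based input validation, with the condition strings taken from the false_condition fields logged above and the msg text purely illustrative:

    # Sketch of the two validation tasks in manage_test_interface.yml.
    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be 'present' or 'absent'"   # message text is an assumption
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be one of 'dummy', 'tap', 'veth'"   # message text is an assumption
      when: type not in ["dummy", "tap", "veth"]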
22225 1726882754.06698: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 22225 1726882754.06760: in run() - task 0affc7ec-ae25-ec05-55b7-000000000158 22225 1726882754.06768: variable 'ansible_search_path' from source: unknown 22225 1726882754.06771: variable 'ansible_search_path' from source: unknown 22225 1726882754.06805: calling self._execute() 22225 1726882754.06865: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.06869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.06879: variable 'omit' from source: magic vars 22225 1726882754.07190: variable 'ansible_distribution_major_version' from source: facts 22225 1726882754.07200: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882754.07205: _execute() done 22225 1726882754.07208: dumping result to json 22225 1726882754.07210: done dumping result, returning 22225 1726882754.07217: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0affc7ec-ae25-ec05-55b7-000000000158] 22225 1726882754.07224: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000158 22225 1726882754.07312: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000158 22225 1726882754.07314: WORKER PROCESS EXITING 22225 1726882754.07356: no more pending results, returning what we have 22225 1726882754.07360: in VariableManager get_vars() 22225 1726882754.07396: Calling all_inventory to load vars for managed_node1 22225 1726882754.07399: Calling groups_inventory to load vars for managed_node1 22225 1726882754.07401: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.07410: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.07413: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.07416: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.07564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.07697: done with get_vars() 22225 1726882754.07703: variable 'ansible_search_path' from source: unknown 22225 1726882754.07703: variable 'ansible_search_path' from source: unknown 22225 1726882754.07729: we have included files to process 22225 1726882754.07730: generating all_blocks data 22225 1726882754.07731: done generating all_blocks data 22225 1726882754.07734: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22225 1726882754.07735: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22225 1726882754.07737: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22225 1726882754.07803: in VariableManager get_vars() 22225 1726882754.07818: done with get_vars() 22225 1726882754.07895: done processing included file 22225 1726882754.07897: iterating over new_blocks loaded from include file 22225 1726882754.07898: in VariableManager get_vars() 22225 1726882754.07910: done with get_vars() 22225 1726882754.07911: filtering new block on tags 22225 1726882754.07924: done filtering new block on tags 22225 1726882754.07925: done iterating over new_blocks loaded from include file included: 
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 22225 1726882754.07928: extending task lists for all hosts with included blocks 22225 1726882754.08184: done extending task lists 22225 1726882754.08186: done processing included files 22225 1726882754.08186: results queue empty 22225 1726882754.08187: checking for any_errors_fatal 22225 1726882754.08188: done checking for any_errors_fatal 22225 1726882754.08189: checking for max_fail_percentage 22225 1726882754.08190: done checking for max_fail_percentage 22225 1726882754.08191: checking to see if all hosts have failed and the running result is not ok 22225 1726882754.08192: done checking to see if all hosts have failed 22225 1726882754.08192: getting the remaining hosts for this loop 22225 1726882754.08193: done getting the remaining hosts for this loop 22225 1726882754.08195: getting the next task for host managed_node1 22225 1726882754.08197: done getting next task for host managed_node1 22225 1726882754.08199: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22225 1726882754.08201: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882754.08202: getting variables 22225 1726882754.08203: in VariableManager get_vars() 22225 1726882754.08212: Calling all_inventory to load vars for managed_node1 22225 1726882754.08213: Calling groups_inventory to load vars for managed_node1 22225 1726882754.08214: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.08218: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.08219: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.08221: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.08334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.08465: done with get_vars() 22225 1726882754.08471: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:14 -0400 (0:00:00.021) 0:00:09.477 ****** 22225 1726882754.08521: entering _queue_task() for managed_node1/include_tasks 22225 1726882754.08694: worker is 1 (out of 1 available) 22225 1726882754.08707: exiting _queue_task() for managed_node1/include_tasks 22225 1726882754.08718: done queuing things up, now waiting for results queue to drain 22225 1726882754.08719: waiting for pending results... 
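At this point the include chain is tests_ipv6.yml → manage_test_interface.yml → show_interfaces.yml → get_current_interfaces.yml. Given the task paths logged (an include at show_interfaces.yml:3 and the debug at show_interfaces.yml:5 seen earlier), show_interfaces.yml is presumably just a small wrapper along these lines:

    # Hypothetical reconstruction of tasks/show_interfaces.yml.
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml
    # ...followed by the 'Show current_interfaces' debug task sketched earlier.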
22225 1726882754.08859: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 22225 1726882754.08925: in run() - task 0affc7ec-ae25-ec05-55b7-00000000017f 22225 1726882754.08937: variable 'ansible_search_path' from source: unknown 22225 1726882754.08940: variable 'ansible_search_path' from source: unknown 22225 1726882754.08968: calling self._execute() 22225 1726882754.09033: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.09038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.09047: variable 'omit' from source: magic vars 22225 1726882754.09315: variable 'ansible_distribution_major_version' from source: facts 22225 1726882754.09325: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882754.09332: _execute() done 22225 1726882754.09335: dumping result to json 22225 1726882754.09337: done dumping result, returning 22225 1726882754.09344: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0affc7ec-ae25-ec05-55b7-00000000017f] 22225 1726882754.09349: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000017f 22225 1726882754.09437: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000017f 22225 1726882754.09440: WORKER PROCESS EXITING 22225 1726882754.09468: no more pending results, returning what we have 22225 1726882754.09473: in VariableManager get_vars() 22225 1726882754.09509: Calling all_inventory to load vars for managed_node1 22225 1726882754.09512: Calling groups_inventory to load vars for managed_node1 22225 1726882754.09514: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.09529: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.09532: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.09535: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.09665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.09800: done with get_vars() 22225 1726882754.09806: variable 'ansible_search_path' from source: unknown 22225 1726882754.09806: variable 'ansible_search_path' from source: unknown 22225 1726882754.09847: we have included files to process 22225 1726882754.09848: generating all_blocks data 22225 1726882754.09849: done generating all_blocks data 22225 1726882754.09850: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22225 1726882754.09851: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22225 1726882754.09854: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22225 1726882754.10036: done processing included file 22225 1726882754.10038: iterating over new_blocks loaded from include file 22225 1726882754.10039: in VariableManager get_vars() 22225 1726882754.10051: done with get_vars() 22225 1726882754.10052: filtering new block on tags 22225 1726882754.10064: done filtering new block on tags 22225 1726882754.10065: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node1 22225 1726882754.10069: extending task lists for all hosts with included blocks 22225 1726882754.10167: done extending task lists 22225 1726882754.10168: done processing included files 22225 1726882754.10168: results queue empty 22225 1726882754.10169: checking for any_errors_fatal 22225 1726882754.10171: done checking for any_errors_fatal 22225 1726882754.10171: checking for max_fail_percentage 22225 1726882754.10172: done checking for max_fail_percentage 22225 1726882754.10173: checking to see if all hosts have failed and the running result is not ok 22225 1726882754.10173: done checking to see if all hosts have failed 22225 1726882754.10174: getting the remaining hosts for this loop 22225 1726882754.10174: done getting the remaining hosts for this loop 22225 1726882754.10176: getting the next task for host managed_node1 22225 1726882754.10181: done getting next task for host managed_node1 22225 1726882754.10182: ^ task is: TASK: Gather current interface info 22225 1726882754.10185: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882754.10186: getting variables 22225 1726882754.10187: in VariableManager get_vars() 22225 1726882754.10196: Calling all_inventory to load vars for managed_node1 22225 1726882754.10197: Calling groups_inventory to load vars for managed_node1 22225 1726882754.10198: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.10202: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.10205: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.10206: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.10321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.10453: done with get_vars() 22225 1726882754.10460: done getting variables 22225 1726882754.10487: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:14 -0400 (0:00:00.019) 0:00:09.497 ****** 22225 1726882754.10509: entering _queue_task() for managed_node1/command 22225 1726882754.10689: worker is 1 (out of 1 available) 22225 1726882754.10703: exiting _queue_task() for managed_node1/command 22225 1726882754.10713: done queuing things up, now waiting for results queue to drain 22225 1726882754.10714: waiting for pending results... 
22225 1726882754.10864: running TaskExecutor() for managed_node1/TASK: Gather current interface info 22225 1726882754.10936: in run() - task 0affc7ec-ae25-ec05-55b7-0000000001b6 22225 1726882754.10952: variable 'ansible_search_path' from source: unknown 22225 1726882754.10956: variable 'ansible_search_path' from source: unknown 22225 1726882754.10980: calling self._execute() 22225 1726882754.11043: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.11049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.11059: variable 'omit' from source: magic vars 22225 1726882754.11326: variable 'ansible_distribution_major_version' from source: facts 22225 1726882754.11336: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882754.11341: variable 'omit' from source: magic vars 22225 1726882754.11380: variable 'omit' from source: magic vars 22225 1726882754.11409: variable 'omit' from source: magic vars 22225 1726882754.11440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882754.11469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882754.11487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882754.11509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882754.11512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882754.11539: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882754.11542: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.11544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.11616: Set connection var ansible_connection to ssh 22225 1726882754.11624: Set connection var ansible_pipelining to False 22225 1726882754.11634: Set connection var ansible_shell_executable to /bin/sh 22225 1726882754.11639: Set connection var ansible_timeout to 10 22225 1726882754.11642: Set connection var ansible_shell_type to sh 22225 1726882754.11648: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882754.11666: variable 'ansible_shell_executable' from source: unknown 22225 1726882754.11670: variable 'ansible_connection' from source: unknown 22225 1726882754.11672: variable 'ansible_module_compression' from source: unknown 22225 1726882754.11675: variable 'ansible_shell_type' from source: unknown 22225 1726882754.11677: variable 'ansible_shell_executable' from source: unknown 22225 1726882754.11680: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.11686: variable 'ansible_pipelining' from source: unknown 22225 1726882754.11689: variable 'ansible_timeout' from source: unknown 22225 1726882754.11693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.11799: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882754.11808: variable 'omit' from source: magic vars 22225 
1726882754.11815: starting attempt loop 22225 1726882754.11818: running the handler 22225 1726882754.11833: _low_level_execute_command(): starting 22225 1726882754.11842: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882754.12372: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882754.12376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.12379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882754.12383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.12441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882754.12444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882754.12451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.12511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.14269: stdout chunk (state=3): >>>/root <<< 22225 1726882754.14387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882754.14436: stderr chunk (state=3): >>><<< 22225 1726882754.14440: stdout chunk (state=3): >>><<< 22225 1726882754.14460: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882754.14471: _low_level_execute_command(): starting 22225 1726882754.14477: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120 `" && echo ansible-tmp-1726882754.1445978-22570-142180023008120="` echo /root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120 `" ) && sleep 0' 22225 1726882754.14936: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882754.14939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882754.14942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882754.14950: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882754.14953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.15008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882754.15011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882754.15013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.15059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.17043: stdout chunk (state=3): >>>ansible-tmp-1726882754.1445978-22570-142180023008120=/root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120 <<< 22225 1726882754.17167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882754.17206: stderr chunk (state=3): >>><<< 22225 1726882754.17210: stdout chunk (state=3): >>><<< 22225 1726882754.17225: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882754.1445978-22570-142180023008120=/root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 22225 1726882754.17251: variable 'ansible_module_compression' from source: unknown 22225 1726882754.17295: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882754.17326: variable 'ansible_facts' from source: unknown 22225 1726882754.17386: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/AnsiballZ_command.py 22225 1726882754.17481: Sending initial data 22225 1726882754.17484: Sent initial data (156 bytes) 22225 1726882754.17917: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882754.17923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882754.17926: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882754.17928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.17975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882754.17978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.18034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.19642: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882754.19693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882754.19747: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp5azf6cwi /root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/AnsiballZ_command.py <<< 22225 1726882754.19750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/AnsiballZ_command.py" <<< 22225 1726882754.19799: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp5azf6cwi" to remote "/root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/AnsiballZ_command.py" <<< 22225 1726882754.20373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882754.20426: stderr chunk (state=3): >>><<< 22225 1726882754.20430: stdout chunk (state=3): >>><<< 22225 1726882754.20449: done transferring module to remote 22225 1726882754.20459: _low_level_execute_command(): starting 22225 1726882754.20463: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/ /root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/AnsiballZ_command.py && sleep 0' 22225 1726882754.20897: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882754.20900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882754.20903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882754.20905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882754.20910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.20964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882754.20967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.21016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.23027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882754.23031: stdout chunk (state=3): >>><<< 22225 1726882754.23034: stderr chunk (state=3): >>><<< 22225 1726882754.23036: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 
originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882754.23039: _low_level_execute_command(): starting 22225 1726882754.23041: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/AnsiballZ_command.py && sleep 0' 22225 1726882754.23724: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882754.23744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882754.23756: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.23847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.40975: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:14.404481", "end": "2024-09-20 21:39:14.408094", "delta": "0:00:00.003613", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882754.42767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882754.42771: stdout chunk (state=3): >>><<< 22225 1726882754.42773: stderr chunk (state=3): >>><<< 22225 1726882754.42928: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:14.404481", "end": "2024-09-20 21:39:14.408094", "delta": "0:00:00.003613", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
22225 1726882754.42933: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882754.42936: _low_level_execute_command(): starting 22225 1726882754.42938: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882754.1445978-22570-142180023008120/ > /dev/null 2>&1 && sleep 0' 22225 1726882754.43590: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882754.43617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882754.43636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882754.43724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.43776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882754.43802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.43891: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.45918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882754.45924: stdout chunk (state=3): >>><<< 22225 1726882754.45927: stderr chunk (state=3): >>><<< 22225 1726882754.46132: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882754.46135: handler run complete 22225 1726882754.46138: Evaluated conditional (False): False 22225 1726882754.46140: attempt loop complete, returning result 22225 1726882754.46142: _execute() done 22225 1726882754.46145: dumping result to json 22225 1726882754.46147: done dumping result, returning 22225 1726882754.46149: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0affc7ec-ae25-ec05-55b7-0000000001b6] 22225 1726882754.46151: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001b6 22225 1726882754.46244: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001b6 22225 1726882754.46248: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003613", "end": "2024-09-20 21:39:14.408094", "rc": 0, "start": "2024-09-20 21:39:14.404481" } STDOUT: bonding_masters eth0 lo 22225 1726882754.46346: no more pending results, returning what we have 22225 1726882754.46351: results queue empty 22225 1726882754.46352: checking for any_errors_fatal 22225 1726882754.46361: done checking for any_errors_fatal 22225 1726882754.46362: checking for max_fail_percentage 22225 1726882754.46364: done checking for max_fail_percentage 22225 1726882754.46365: checking to see if all hosts have failed and the running result is not ok 22225 1726882754.46366: done checking to see if all hosts have failed 22225 1726882754.46367: getting the remaining hosts for this loop 22225 1726882754.46368: done getting the remaining hosts for this loop 22225 1726882754.46373: getting the next task for host managed_node1 22225 1726882754.46383: done getting next task for host managed_node1 22225 1726882754.46386: ^ task is: TASK: Set current_interfaces 22225 1726882754.46391: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882754.46396: getting variables 22225 1726882754.46398: in VariableManager get_vars() 22225 1726882754.46572: Calling all_inventory to load vars for managed_node1 22225 1726882754.46576: Calling groups_inventory to load vars for managed_node1 22225 1726882754.46578: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.46594: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.46597: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.46600: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.46945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.47282: done with get_vars() 22225 1726882754.47294: done getting variables 22225 1726882754.47410: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:14 -0400 (0:00:00.369) 0:00:09.867 ****** 22225 1726882754.47453: entering _queue_task() for managed_node1/set_fact 22225 1726882754.47791: worker is 1 (out of 1 available) 22225 1726882754.47805: exiting _queue_task() for managed_node1/set_fact 22225 1726882754.47815: done queuing things up, now waiting for results queue to drain 22225 1726882754.47817: waiting for pending results... 
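For readers reconstructing the play from this trace: the "Gather current interface info" task whose result appears above invoked ansible.legacy.command with chdir=/sys/class/net and _raw_params='ls -1', and its output is later consumed as the registered variable '_current_interfaces'. A plausible sketch of that task, inferred only from the logged module arguments (the exact YAML in get_current_interfaces.yml may differ), is:

    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces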
22225 1726882754.48202: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 22225 1726882754.48207: in run() - task 0affc7ec-ae25-ec05-55b7-0000000001b7 22225 1726882754.48211: variable 'ansible_search_path' from source: unknown 22225 1726882754.48218: variable 'ansible_search_path' from source: unknown 22225 1726882754.48261: calling self._execute() 22225 1726882754.48363: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.48376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.48393: variable 'omit' from source: magic vars 22225 1726882754.48815: variable 'ansible_distribution_major_version' from source: facts 22225 1726882754.48842: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882754.48951: variable 'omit' from source: magic vars 22225 1726882754.48956: variable 'omit' from source: magic vars 22225 1726882754.49059: variable '_current_interfaces' from source: set_fact 22225 1726882754.49135: variable 'omit' from source: magic vars 22225 1726882754.49227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882754.49238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882754.49263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882754.49300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882754.49316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882754.49353: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882754.49361: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.49389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.49503: Set connection var ansible_connection to ssh 22225 1726882754.49607: Set connection var ansible_pipelining to False 22225 1726882754.49611: Set connection var ansible_shell_executable to /bin/sh 22225 1726882754.49613: Set connection var ansible_timeout to 10 22225 1726882754.49615: Set connection var ansible_shell_type to sh 22225 1726882754.49618: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882754.49620: variable 'ansible_shell_executable' from source: unknown 22225 1726882754.49624: variable 'ansible_connection' from source: unknown 22225 1726882754.49627: variable 'ansible_module_compression' from source: unknown 22225 1726882754.49629: variable 'ansible_shell_type' from source: unknown 22225 1726882754.49631: variable 'ansible_shell_executable' from source: unknown 22225 1726882754.49634: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.49636: variable 'ansible_pipelining' from source: unknown 22225 1726882754.49638: variable 'ansible_timeout' from source: unknown 22225 1726882754.49640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.49826: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 22225 1726882754.49867: variable 'omit' from source: magic vars 22225 1726882754.49869: starting attempt loop 22225 1726882754.49872: running the handler 22225 1726882754.49875: handler run complete 22225 1726882754.49877: attempt loop complete, returning result 22225 1726882754.49881: _execute() done 22225 1726882754.49884: dumping result to json 22225 1726882754.49889: done dumping result, returning 22225 1726882754.49932: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0affc7ec-ae25-ec05-55b7-0000000001b7] 22225 1726882754.49935: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001b7 ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 22225 1726882754.50152: no more pending results, returning what we have 22225 1726882754.50156: results queue empty 22225 1726882754.50157: checking for any_errors_fatal 22225 1726882754.50166: done checking for any_errors_fatal 22225 1726882754.50167: checking for max_fail_percentage 22225 1726882754.50171: done checking for max_fail_percentage 22225 1726882754.50172: checking to see if all hosts have failed and the running result is not ok 22225 1726882754.50173: done checking to see if all hosts have failed 22225 1726882754.50174: getting the remaining hosts for this loop 22225 1726882754.50175: done getting the remaining hosts for this loop 22225 1726882754.50181: getting the next task for host managed_node1 22225 1726882754.50191: done getting next task for host managed_node1 22225 1726882754.50196: ^ task is: TASK: Show current_interfaces 22225 1726882754.50200: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882754.50203: getting variables 22225 1726882754.50205: in VariableManager get_vars() 22225 1726882754.50350: Calling all_inventory to load vars for managed_node1 22225 1726882754.50354: Calling groups_inventory to load vars for managed_node1 22225 1726882754.50356: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.50366: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.50369: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.50373: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.50686: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001b7 22225 1726882754.50691: WORKER PROCESS EXITING 22225 1726882754.50715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.51005: done with get_vars() 22225 1726882754.51015: done getting variables 22225 1726882754.51073: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:14 -0400 (0:00:00.036) 0:00:09.903 ****** 22225 1726882754.51115: entering _queue_task() for managed_node1/debug 22225 1726882754.51455: worker is 1 (out of 1 available) 22225 1726882754.51467: exiting _queue_task() for managed_node1/debug 22225 1726882754.51477: done queuing things up, now waiting for results queue to drain 22225 1726882754.51478: waiting for pending results... 
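The "Set current_interfaces" result above shows ansible_facts.current_interfaces set to exactly the stdout lines of the preceding ls of /sys/class/net. A minimal sketch of the set_fact task at get_current_interfaces.yml:9, assuming it simply copies the registered command output's stdout_lines, would be:

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"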
22225 1726882754.51672: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 22225 1726882754.51804: in run() - task 0affc7ec-ae25-ec05-55b7-000000000180 22225 1726882754.51834: variable 'ansible_search_path' from source: unknown 22225 1726882754.51844: variable 'ansible_search_path' from source: unknown 22225 1726882754.51897: calling self._execute() 22225 1726882754.51998: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.52010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.52029: variable 'omit' from source: magic vars 22225 1726882754.52544: variable 'ansible_distribution_major_version' from source: facts 22225 1726882754.52548: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882754.52550: variable 'omit' from source: magic vars 22225 1726882754.52754: variable 'omit' from source: magic vars 22225 1726882754.52941: variable 'current_interfaces' from source: set_fact 22225 1726882754.53140: variable 'omit' from source: magic vars 22225 1726882754.53228: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882754.53334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882754.53451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882754.53515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882754.53832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882754.53835: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882754.53838: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.53840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.53899: Set connection var ansible_connection to ssh 22225 1726882754.53914: Set connection var ansible_pipelining to False 22225 1726882754.53930: Set connection var ansible_shell_executable to /bin/sh 22225 1726882754.53948: Set connection var ansible_timeout to 10 22225 1726882754.53956: Set connection var ansible_shell_type to sh 22225 1726882754.53965: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882754.53997: variable 'ansible_shell_executable' from source: unknown 22225 1726882754.54056: variable 'ansible_connection' from source: unknown 22225 1726882754.54064: variable 'ansible_module_compression' from source: unknown 22225 1726882754.54071: variable 'ansible_shell_type' from source: unknown 22225 1726882754.54077: variable 'ansible_shell_executable' from source: unknown 22225 1726882754.54227: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.54230: variable 'ansible_pipelining' from source: unknown 22225 1726882754.54233: variable 'ansible_timeout' from source: unknown 22225 1726882754.54235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.54531: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
22225 1726882754.54549: variable 'omit' from source: magic vars 22225 1726882754.54561: starting attempt loop 22225 1726882754.54569: running the handler 22225 1726882754.54629: handler run complete 22225 1726882754.54726: attempt loop complete, returning result 22225 1726882754.54735: _execute() done 22225 1726882754.54743: dumping result to json 22225 1726882754.54750: done dumping result, returning 22225 1726882754.54764: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0affc7ec-ae25-ec05-55b7-000000000180] 22225 1726882754.54774: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000180 ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 22225 1726882754.54973: no more pending results, returning what we have 22225 1726882754.54977: results queue empty 22225 1726882754.54978: checking for any_errors_fatal 22225 1726882754.54986: done checking for any_errors_fatal 22225 1726882754.54987: checking for max_fail_percentage 22225 1726882754.54989: done checking for max_fail_percentage 22225 1726882754.54990: checking to see if all hosts have failed and the running result is not ok 22225 1726882754.54991: done checking to see if all hosts have failed 22225 1726882754.54992: getting the remaining hosts for this loop 22225 1726882754.54993: done getting the remaining hosts for this loop 22225 1726882754.54998: getting the next task for host managed_node1 22225 1726882754.55007: done getting next task for host managed_node1 22225 1726882754.55011: ^ task is: TASK: Install iproute 22225 1726882754.55014: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882754.55019: getting variables 22225 1726882754.55020: in VariableManager get_vars() 22225 1726882754.55066: Calling all_inventory to load vars for managed_node1 22225 1726882754.55069: Calling groups_inventory to load vars for managed_node1 22225 1726882754.55072: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882754.55088: Calling all_plugins_play to load vars for managed_node1 22225 1726882754.55091: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882754.55095: Calling groups_plugins_play to load vars for managed_node1 22225 1726882754.55931: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000180 22225 1726882754.55935: WORKER PROCESS EXITING 22225 1726882754.55959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882754.56331: done with get_vars() 22225 1726882754.56457: done getting variables 22225 1726882754.56519: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:39:14 -0400 (0:00:00.057) 0:00:09.960 ****** 22225 1726882754.56842: entering _queue_task() for managed_node1/package 22225 1726882754.57272: worker is 1 (out of 1 available) 22225 1726882754.57289: exiting _queue_task() for managed_node1/package 22225 1726882754.57303: done queuing things up, now waiting for results queue to drain 22225 1726882754.57305: waiting for pending results... 
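The "Show current_interfaces" task at show_interfaces.yml:5 is a plain debug call; given the logged message "current_interfaces: ['bonding_masters', 'eth0', 'lo']", it is presumably equivalent to:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"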
22225 1726882754.57759: running TaskExecutor() for managed_node1/TASK: Install iproute 22225 1726882754.57895: in run() - task 0affc7ec-ae25-ec05-55b7-000000000159 22225 1726882754.58230: variable 'ansible_search_path' from source: unknown 22225 1726882754.58234: variable 'ansible_search_path' from source: unknown 22225 1726882754.58237: calling self._execute() 22225 1726882754.58371: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.58509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.58512: variable 'omit' from source: magic vars 22225 1726882754.59500: variable 'ansible_distribution_major_version' from source: facts 22225 1726882754.59504: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882754.59507: variable 'omit' from source: magic vars 22225 1726882754.59509: variable 'omit' from source: magic vars 22225 1726882754.59938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882754.62777: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882754.62862: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882754.62906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882754.62948: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882754.62982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882754.63094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882754.63228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882754.63231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882754.63235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882754.63237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882754.63338: variable '__network_is_ostree' from source: set_fact 22225 1726882754.63349: variable 'omit' from source: magic vars 22225 1726882754.63386: variable 'omit' from source: magic vars 22225 1726882754.63420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882754.63457: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882754.63491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882754.63508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 22225 1726882754.63519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882754.63556: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882754.63559: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.63562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.63675: Set connection var ansible_connection to ssh 22225 1726882754.63697: Set connection var ansible_pipelining to False 22225 1726882754.63705: Set connection var ansible_shell_executable to /bin/sh 22225 1726882754.63711: Set connection var ansible_timeout to 10 22225 1726882754.63714: Set connection var ansible_shell_type to sh 22225 1726882754.63926: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882754.63930: variable 'ansible_shell_executable' from source: unknown 22225 1726882754.63933: variable 'ansible_connection' from source: unknown 22225 1726882754.63935: variable 'ansible_module_compression' from source: unknown 22225 1726882754.63937: variable 'ansible_shell_type' from source: unknown 22225 1726882754.63939: variable 'ansible_shell_executable' from source: unknown 22225 1726882754.63941: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882754.63943: variable 'ansible_pipelining' from source: unknown 22225 1726882754.63946: variable 'ansible_timeout' from source: unknown 22225 1726882754.63948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882754.63951: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882754.63954: variable 'omit' from source: magic vars 22225 1726882754.63956: starting attempt loop 22225 1726882754.63959: running the handler 22225 1726882754.63961: variable 'ansible_facts' from source: unknown 22225 1726882754.63963: variable 'ansible_facts' from source: unknown 22225 1726882754.63966: _low_level_execute_command(): starting 22225 1726882754.63968: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882754.64769: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882754.64838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.64902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882754.64918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882754.64942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.65031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.66845: stdout chunk (state=3): >>>/root <<< 22225 1726882754.67033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882754.67044: stdout chunk (state=3): >>><<< 22225 1726882754.67055: stderr chunk (state=3): >>><<< 22225 1726882754.67092: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882754.67117: _low_level_execute_command(): starting 22225 1726882754.67311: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399 `" && echo ansible-tmp-1726882754.671047-22590-127538225338399="` echo /root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399 `" ) && sleep 0' 22225 1726882754.67871: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882754.68138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882754.68159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.68242: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.70227: stdout chunk (state=3): >>>ansible-tmp-1726882754.671047-22590-127538225338399=/root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399 <<< 22225 1726882754.70405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882754.70420: stdout chunk (state=3): >>><<< 22225 1726882754.70441: stderr chunk (state=3): >>><<< 22225 1726882754.70462: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882754.671047-22590-127538225338399=/root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882754.70497: variable 'ansible_module_compression' from source: unknown 22225 1726882754.70574: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 22225 1726882754.70582: ANSIBALLZ: Acquiring lock 22225 1726882754.70590: ANSIBALLZ: Lock acquired: 140272895053888 22225 1726882754.70597: ANSIBALLZ: Creating module 22225 1726882754.89348: ANSIBALLZ: Writing module into payload 22225 1726882754.89620: ANSIBALLZ: Writing module 22225 1726882754.89626: ANSIBALLZ: Renaming module 22225 1726882754.89629: ANSIBALLZ: Done creating module 22225 1726882754.89642: variable 'ansible_facts' from source: unknown 22225 1726882754.89755: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/AnsiballZ_dnf.py 22225 1726882754.90052: Sending initial data 22225 1726882754.90055: Sent initial data (151 bytes) 22225 1726882754.90666: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882754.90742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.90802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882754.90817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882754.90848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.90933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.92674: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882754.92758: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882754.92784: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpf34yt26v /root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/AnsiballZ_dnf.py <<< 22225 1726882754.92787: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/AnsiballZ_dnf.py" <<< 22225 1726882754.92827: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpf34yt26v" to remote "/root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/AnsiballZ_dnf.py" <<< 22225 1726882754.93939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882754.93942: stdout chunk (state=3): >>><<< 22225 1726882754.94028: stderr chunk (state=3): >>><<< 22225 1726882754.94031: done transferring module to remote 22225 1726882754.94035: _low_level_execute_command(): starting 22225 1726882754.94038: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/ /root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/AnsiballZ_dnf.py && sleep 0' 22225 1726882754.94652: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882754.94673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882754.94687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882754.94706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882754.94721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882754.94789: stderr chunk (state=3): >>>debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.94837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882754.94854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882754.94873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.94958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882754.96933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882754.96937: stdout chunk (state=3): >>><<< 22225 1726882754.96944: stderr chunk (state=3): >>><<< 22225 1726882754.96960: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882754.96963: _low_level_execute_command(): starting 22225 1726882754.96969: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/AnsiballZ_dnf.py && sleep 0' 22225 1726882754.97643: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882754.97728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882754.97731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882754.97769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882754.97781: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882754.97818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882754.97899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.05148: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 22225 1726882756.09861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882756.09916: stderr chunk (state=3): >>><<< 22225 1726882756.09920: stdout chunk (state=3): >>><<< 22225 1726882756.09941: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882756.09984: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882756.09993: _low_level_execute_command(): starting 22225 1726882756.09995: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882754.671047-22590-127538225338399/ > /dev/null 2>&1 && sleep 0' 22225 1726882756.10468: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882756.10472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.10474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882756.10477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882756.10479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.10533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882756.10546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.10599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.12495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.12543: stderr chunk (state=3): >>><<< 22225 1726882756.12548: stdout chunk (state=3): >>><<< 22225 1726882756.12563: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882756.12566: handler run complete 22225 1726882756.12694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882756.12832: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882756.12861: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882756.12891: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882756.12913: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882756.12968: variable '__install_status' from source: unknown 22225 1726882756.12987: Evaluated conditional (__install_status is success): True 22225 1726882756.13000: attempt loop complete, returning result 22225 1726882756.13005: _execute() done 22225 1726882756.13007: dumping result to json 22225 1726882756.13013: done dumping result, returning 22225 1726882756.13020: done running TaskExecutor() for managed_node1/TASK: Install iproute [0affc7ec-ae25-ec05-55b7-000000000159] 22225 1726882756.13026: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000159 22225 1726882756.13130: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000159 22225 1726882756.13133: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 22225 1726882756.13217: no more pending results, returning what we have 22225 1726882756.13221: results queue empty 22225 1726882756.13223: checking for any_errors_fatal 22225 1726882756.13229: done checking for any_errors_fatal 22225 1726882756.13230: checking for max_fail_percentage 22225 1726882756.13231: done checking for max_fail_percentage 22225 1726882756.13232: checking to see if all hosts have failed and the running result is not ok 22225 1726882756.13233: done checking to see if all hosts have failed 22225 1726882756.13234: getting the remaining hosts for this loop 22225 1726882756.13236: done getting the remaining hosts for this loop 22225 1726882756.13240: getting the next task for host managed_node1 22225 1726882756.13246: done getting next task for host managed_node1 22225 1726882756.13248: ^ task is: TASK: Create veth interface {{ interface }} 22225 1726882756.13251: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882756.13254: getting variables 22225 1726882756.13256: in VariableManager get_vars() 22225 1726882756.13296: Calling all_inventory to load vars for managed_node1 22225 1726882756.13299: Calling groups_inventory to load vars for managed_node1 22225 1726882756.13301: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882756.13312: Calling all_plugins_play to load vars for managed_node1 22225 1726882756.13314: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882756.13317: Calling groups_plugins_play to load vars for managed_node1 22225 1726882756.13503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882756.13647: done with get_vars() 22225 1726882756.13658: done getting variables 22225 1726882756.13703: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882756.13798: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:39:16 -0400 (0:00:01.569) 0:00:11.530 ****** 22225 1726882756.13832: entering _queue_task() for managed_node1/command 22225 1726882756.14029: worker is 1 (out of 1 available) 22225 1726882756.14040: exiting _queue_task() for managed_node1/command 22225 1726882756.14050: done queuing things up, now waiting for results queue to drain 22225 1726882756.14050: waiting for pending results... 
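
[Annotation] The trace above shows the 'Create veth interface veth0' task being prepared: the 'items' lookup plugin is loaded, both conditionals (ansible_distribution_major_version != '6' and type == 'veth' and state == 'present' and interface not in current_interfaces) evaluate to True, and the 'command' action plugin is selected. The task itself lives at playbooks/tasks/manage_test_interface.yml:27 and is not reproduced in this log; the sketch below is an approximate reconstruction inferred from the loop items and conditionals visible in the trace, not the actual file contents.

    # Approximate reconstruction inferred from this log; the real
    # manage_test_interface.yml may differ in wording and options.
    # The third loop item is inferred from the additional command
    # execution that begins at the end of this excerpt.
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when:
        - type == 'veth'
        - state == 'present'
        - interface not in current_interfaces
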
22225 1726882756.14270: running TaskExecutor() for managed_node1/TASK: Create veth interface veth0 22225 1726882756.14329: in run() - task 0affc7ec-ae25-ec05-55b7-00000000015a 22225 1726882756.14341: variable 'ansible_search_path' from source: unknown 22225 1726882756.14345: variable 'ansible_search_path' from source: unknown 22225 1726882756.14555: variable 'interface' from source: play vars 22225 1726882756.14616: variable 'interface' from source: play vars 22225 1726882756.14672: variable 'interface' from source: play vars 22225 1726882756.14859: Loaded config def from plugin (lookup/items) 22225 1726882756.14864: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 22225 1726882756.14878: variable 'omit' from source: magic vars 22225 1726882756.14954: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882756.14968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882756.14978: variable 'omit' from source: magic vars 22225 1726882756.15138: variable 'ansible_distribution_major_version' from source: facts 22225 1726882756.15147: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882756.15278: variable 'type' from source: play vars 22225 1726882756.15285: variable 'state' from source: include params 22225 1726882756.15290: variable 'interface' from source: play vars 22225 1726882756.15294: variable 'current_interfaces' from source: set_fact 22225 1726882756.15297: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22225 1726882756.15306: variable 'omit' from source: magic vars 22225 1726882756.15332: variable 'omit' from source: magic vars 22225 1726882756.15364: variable 'item' from source: unknown 22225 1726882756.15415: variable 'item' from source: unknown 22225 1726882756.15432: variable 'omit' from source: magic vars 22225 1726882756.15457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882756.15482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882756.15499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882756.15512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882756.15583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882756.15586: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882756.15589: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882756.15592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882756.15634: Set connection var ansible_connection to ssh 22225 1726882756.15639: Set connection var ansible_pipelining to False 22225 1726882756.15647: Set connection var ansible_shell_executable to /bin/sh 22225 1726882756.15652: Set connection var ansible_timeout to 10 22225 1726882756.15655: Set connection var ansible_shell_type to sh 22225 1726882756.15660: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882756.15677: variable 'ansible_shell_executable' from source: unknown 22225 1726882756.15680: variable 'ansible_connection' from source: unknown 22225 1726882756.15688: 
variable 'ansible_module_compression' from source: unknown 22225 1726882756.15693: variable 'ansible_shell_type' from source: unknown 22225 1726882756.15696: variable 'ansible_shell_executable' from source: unknown 22225 1726882756.15698: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882756.15700: variable 'ansible_pipelining' from source: unknown 22225 1726882756.15703: variable 'ansible_timeout' from source: unknown 22225 1726882756.15705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882756.15809: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882756.15818: variable 'omit' from source: magic vars 22225 1726882756.15825: starting attempt loop 22225 1726882756.15828: running the handler 22225 1726882756.15843: _low_level_execute_command(): starting 22225 1726882756.15848: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882756.16381: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882756.16385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.16390: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882756.16392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.16452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882756.16455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882756.16460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.16515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.18159: stdout chunk (state=3): >>>/root <<< 22225 1726882756.18266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.18315: stderr chunk (state=3): >>><<< 22225 1726882756.18318: stdout chunk (state=3): >>><<< 22225 1726882756.18341: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882756.18354: _low_level_execute_command(): starting 22225 1726882756.18359: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177 `" && echo ansible-tmp-1726882756.183401-22652-6703298960177="` echo /root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177 `" ) && sleep 0' 22225 1726882756.18815: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882756.18818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.18821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882756.18827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882756.18829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.18886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882756.18889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882756.18891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.18940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.26399: stdout chunk (state=3): >>>ansible-tmp-1726882756.183401-22652-6703298960177=/root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177 <<< 22225 1726882756.26615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.26618: stdout chunk (state=3): >>><<< 22225 1726882756.26621: stderr chunk (state=3): >>><<< 22225 1726882756.26642: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882756.183401-22652-6703298960177=/root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882756.26686: variable 'ansible_module_compression' from source: unknown 22225 1726882756.26828: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882756.26831: variable 'ansible_facts' from source: unknown 22225 1726882756.26892: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/AnsiballZ_command.py 22225 1726882756.27079: Sending initial data 22225 1726882756.27085: Sent initial data (153 bytes) 22225 1726882756.27634: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882756.27649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882756.27660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.27702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882756.27727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.27767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.29380: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports 
extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882756.29459: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882756.29534: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpdutqn9du /root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/AnsiballZ_command.py <<< 22225 1726882756.29545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/AnsiballZ_command.py" <<< 22225 1726882756.29829: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpdutqn9du" to remote "/root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/AnsiballZ_command.py" <<< 22225 1726882756.30681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.30694: stdout chunk (state=3): >>><<< 22225 1726882756.30705: stderr chunk (state=3): >>><<< 22225 1726882756.30734: done transferring module to remote 22225 1726882756.30749: _low_level_execute_command(): starting 22225 1726882756.30759: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/ /root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/AnsiballZ_command.py && sleep 0' 22225 1726882756.31380: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882756.31398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882756.31437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882756.31452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882756.31467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882756.31554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882756.31592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.31640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.33546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.33569: stderr chunk (state=3): >>><<< 22225 1726882756.33584: stdout chunk (state=3): >>><<< 22225 1726882756.33608: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882756.33645: _low_level_execute_command(): starting 22225 1726882756.33649: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/AnsiballZ_command.py && sleep 0' 22225 1726882756.34296: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882756.34343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.34366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882756.34370: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882756.34443: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.34476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882756.34498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882756.34519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.34608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.52648: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:39:16.509189", "end": "2024-09-20 21:39:16.516577", "delta": "0:00:00.007388", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882756.54399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.54413: stderr chunk (state=3): >>>Shared connection to 10.31.15.7 closed. <<< 22225 1726882756.54591: stderr chunk (state=3): >>><<< 22225 1726882756.54816: stdout chunk (state=3): >>><<< 22225 1726882756.54820: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:39:16.509189", "end": "2024-09-20 21:39:16.516577", "delta": "0:00:00.007388", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
22225 1726882756.54825: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882756.54833: _low_level_execute_command(): starting 22225 1726882756.54836: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882756.183401-22652-6703298960177/ > /dev/null 2>&1 && sleep 0' 22225 1726882756.56162: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882756.56235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.56310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882756.56443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882756.56448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.56639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.61584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.61759: stderr chunk (state=3): >>><<< 22225 1726882756.61768: stdout chunk (state=3): >>><<< 22225 1726882756.61796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882756.61808: handler run complete 22225 1726882756.61839: Evaluated conditional (False): False 22225 1726882756.61910: attempt loop complete, returning result 22225 1726882756.61942: variable 'item' from source: unknown 22225 1726882756.62088: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.007388", "end": "2024-09-20 21:39:16.516577", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 21:39:16.509189" } 22225 1726882756.62729: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882756.62735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882756.62738: variable 'omit' from source: magic vars 22225 1726882756.62740: variable 'ansible_distribution_major_version' from source: facts 22225 1726882756.62743: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882756.62745: variable 'type' from source: play vars 22225 1726882756.62747: variable 'state' from source: include params 22225 1726882756.62749: variable 'interface' from source: play vars 22225 1726882756.62752: variable 'current_interfaces' from source: set_fact 22225 1726882756.62754: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22225 1726882756.62755: variable 'omit' from source: magic vars 22225 1726882756.62758: variable 'omit' from source: magic vars 22225 1726882756.62760: variable 'item' from source: unknown 22225 1726882756.62762: variable 'item' from source: unknown 22225 1726882756.62763: variable 'omit' from source: magic vars 22225 1726882756.62766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882756.62770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882756.62782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882756.62799: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882756.62806: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882756.62812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882756.63129: Set connection var ansible_connection to ssh 22225 1726882756.63132: Set connection var ansible_pipelining to False 22225 1726882756.63135: Set connection var ansible_shell_executable to /bin/sh 22225 1726882756.63137: Set connection var ansible_timeout to 10 22225 1726882756.63140: Set connection var ansible_shell_type to sh 22225 1726882756.63142: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882756.63144: variable 'ansible_shell_executable' from source: 
unknown 22225 1726882756.63146: variable 'ansible_connection' from source: unknown 22225 1726882756.63149: variable 'ansible_module_compression' from source: unknown 22225 1726882756.63151: variable 'ansible_shell_type' from source: unknown 22225 1726882756.63153: variable 'ansible_shell_executable' from source: unknown 22225 1726882756.63155: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882756.63157: variable 'ansible_pipelining' from source: unknown 22225 1726882756.63160: variable 'ansible_timeout' from source: unknown 22225 1726882756.63162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882756.63206: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882756.63224: variable 'omit' from source: magic vars 22225 1726882756.63235: starting attempt loop 22225 1726882756.63243: running the handler 22225 1726882756.63255: _low_level_execute_command(): starting 22225 1726882756.63263: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882756.64539: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882756.64940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.64949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882756.64952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.65041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.66696: stdout chunk (state=3): >>>/root <<< 22225 1726882756.66933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.66943: stdout chunk (state=3): >>><<< 22225 1726882756.66950: stderr chunk (state=3): >>><<< 22225 1726882756.66967: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882756.66977: _low_level_execute_command(): starting 22225 1726882756.66986: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791 `" && echo ansible-tmp-1726882756.6696773-22652-256562269874791="` echo /root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791 `" ) && sleep 0' 22225 1726882756.68742: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882756.68746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.68838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.70805: stdout chunk (state=3): >>>ansible-tmp-1726882756.6696773-22652-256562269874791=/root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791 <<< 22225 1726882756.71117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.71121: stderr chunk (state=3): >>><<< 22225 1726882756.71129: stdout chunk (state=3): >>><<< 22225 1726882756.71159: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882756.6696773-22652-256562269874791=/root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882756.71196: variable 'ansible_module_compression' from source: unknown 22225 1726882756.71291: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882756.71383: variable 'ansible_facts' from source: unknown 22225 1726882756.71586: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/AnsiballZ_command.py 22225 1726882756.72057: Sending initial data 22225 1726882756.72060: Sent initial data (156 bytes) 22225 1726882756.73830: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882756.73834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882756.74038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.75543: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882756.75645: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882756.75711: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp63vp5yj1 /root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/AnsiballZ_command.py <<< 22225 1726882756.75726: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/AnsiballZ_command.py" <<< 22225 1726882756.75940: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 22225 1726882756.75943: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp63vp5yj1" to remote "/root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/AnsiballZ_command.py" <<< 22225 1726882756.75945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/AnsiballZ_command.py" <<< 22225 1726882756.77193: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.77289: stderr chunk (state=3): >>><<< 22225 1726882756.77293: stdout chunk (state=3): >>><<< 22225 1726882756.77316: done transferring module to remote 22225 1726882756.77328: _low_level_execute_command(): starting 22225 1726882756.77333: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/ /root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/AnsiballZ_command.py && sleep 0' 22225 1726882756.78353: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882756.78538: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882756.78549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882756.78564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882756.78576: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882756.78587: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882756.78597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882756.78612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882756.78619: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882756.78626: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882756.78713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882756.78846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.78949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.80743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882756.81041: stderr chunk (state=3): >>><<< 22225 1726882756.81045: stdout 
chunk (state=3): >>><<< 22225 1726882756.81047: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882756.81050: _low_level_execute_command(): starting 22225 1726882756.81052: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/AnsiballZ_command.py && sleep 0' 22225 1726882756.82192: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882756.82205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882756.82220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882756.82332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882756.82600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882756.82691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882756.99754: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:39:16.991980", "end": "2024-09-20 21:39:16.995887", "delta": "0:00:00.003907", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 
1726882757.01431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882757.01443: stdout chunk (state=3): >>><<< 22225 1726882757.01456: stderr chunk (state=3): >>><<< 22225 1726882757.01492: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:39:16.991980", "end": "2024-09-20 21:39:16.995887", "delta": "0:00:00.003907", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
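
[Annotation] Each of these short remote steps (creating the temp dir, transferring the module over sftp, chmod, the Python run, and the final rm) reuses one multiplexed SSH connection: the stderr chunks repeatedly show 'auto-mux: Trying existing master at /root/.ansible/cp/d9f6ac3c31' followed by 'mux_client_request_session', i.e. OpenSSH ControlMaster reuse instead of a fresh handshake per command. This comes from ansible-core's default ssh connection settings (ControlMaster/ControlPersist with a control path under ~/.ansible/cp). The sketch below shows how roughly equivalent multiplexing could be made explicit for a host; the option names are standard OpenSSH options, but the exact defaults and the hashed control path used in this run are assumptions, not read from the log.

    # Sketch only: making SSH multiplexing like that seen in the log
    # explicit via host vars. The ControlPath value is an example, not
    # this run's actual hashed path under /root/.ansible/cp.
    - hosts: managed_node1
      vars:
        ansible_ssh_common_args: >-
          -o ControlMaster=auto
          -o ControlPersist=60s
          -o ControlPath=~/.ansible/cp/%C
      tasks:
        - name: Any task now reuses the persisted master connection
          ping:
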
22225 1726882757.01548: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882757.01561: _low_level_execute_command(): starting 22225 1726882757.01574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882756.6696773-22652-256562269874791/ > /dev/null 2>&1 && sleep 0' 22225 1726882757.02331: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.02336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882757.02347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882757.02370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.02419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882757.02442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.02496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.04536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.04539: stdout chunk (state=3): >>><<< 22225 1726882757.04541: stderr chunk (state=3): >>><<< 22225 1726882757.04605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.04610: handler run complete 22225 1726882757.04612: Evaluated conditional (False): False 22225 1726882757.04614: attempt loop complete, returning result 22225 1726882757.04635: variable 'item' from source: unknown 22225 1726882757.04718: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003907", "end": "2024-09-20 21:39:16.995887", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 21:39:16.991980" } 22225 1726882757.04853: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.04856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.04859: variable 'omit' from source: magic vars 22225 1726882757.05150: variable 'ansible_distribution_major_version' from source: facts 22225 1726882757.05154: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882757.05269: variable 'type' from source: play vars 22225 1726882757.05279: variable 'state' from source: include params 22225 1726882757.05288: variable 'interface' from source: play vars 22225 1726882757.05295: variable 'current_interfaces' from source: set_fact 22225 1726882757.05305: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22225 1726882757.05313: variable 'omit' from source: magic vars 22225 1726882757.05336: variable 'omit' from source: magic vars 22225 1726882757.05392: variable 'item' from source: unknown 22225 1726882757.05463: variable 'item' from source: unknown 22225 1726882757.05491: variable 'omit' from source: magic vars 22225 1726882757.05515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882757.05532: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882757.05544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882757.05562: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882757.05570: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.05585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.05735: Set connection var ansible_connection to ssh 22225 1726882757.05751: Set connection var ansible_pipelining to False 22225 1726882757.05763: Set connection var ansible_shell_executable to /bin/sh 22225 1726882757.05779: Set connection var ansible_timeout to 10 22225 1726882757.05806: Set connection var ansible_shell_type to sh 22225 1726882757.05824: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882757.05912: variable 'ansible_shell_executable' from source: unknown 22225 
1726882757.05916: variable 'ansible_connection' from source: unknown 22225 1726882757.05918: variable 'ansible_module_compression' from source: unknown 22225 1726882757.05921: variable 'ansible_shell_type' from source: unknown 22225 1726882757.05926: variable 'ansible_shell_executable' from source: unknown 22225 1726882757.05929: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.05931: variable 'ansible_pipelining' from source: unknown 22225 1726882757.05987: variable 'ansible_timeout' from source: unknown 22225 1726882757.05990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.06042: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882757.06109: variable 'omit' from source: magic vars 22225 1726882757.06112: starting attempt loop 22225 1726882757.06115: running the handler 22225 1726882757.06117: _low_level_execute_command(): starting 22225 1726882757.06120: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882757.06742: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.06768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.06858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.06906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.08547: stdout chunk (state=3): >>>/root <<< 22225 1726882757.08653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.08700: stderr chunk (state=3): >>><<< 22225 1726882757.08703: stdout chunk (state=3): >>><<< 22225 1726882757.08716: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.08731: _low_level_execute_command(): starting 22225 1726882757.08738: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495 `" && echo ansible-tmp-1726882757.0871563-22652-147299955133495="` echo /root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495 `" ) && sleep 0' 22225 1726882757.09231: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.09260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.09264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.09342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.09393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.11329: stdout chunk (state=3): >>>ansible-tmp-1726882757.0871563-22652-147299955133495=/root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495 <<< 22225 1726882757.11448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.11499: stderr chunk (state=3): >>><<< 22225 1726882757.11502: stdout chunk (state=3): >>><<< 22225 1726882757.11517: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882757.0871563-22652-147299955133495=/root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.11538: variable 'ansible_module_compression' from source: unknown 22225 1726882757.11566: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882757.11584: variable 'ansible_facts' from source: unknown 22225 1726882757.11628: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/AnsiballZ_command.py 22225 1726882757.11717: Sending initial data 22225 1726882757.11721: Sent initial data (156 bytes) 22225 1726882757.12179: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.12183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882757.12187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882757.12189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.12195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.12241: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882757.12258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.12301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.13885: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882757.13934: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882757.13990: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp767tedrn /root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/AnsiballZ_command.py <<< 22225 1726882757.13993: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/AnsiballZ_command.py" <<< 22225 1726882757.14036: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp767tedrn" to remote "/root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/AnsiballZ_command.py" <<< 22225 1726882757.14040: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/AnsiballZ_command.py" <<< 22225 1726882757.14643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.14708: stderr chunk (state=3): >>><<< 22225 1726882757.14711: stdout chunk (state=3): >>><<< 22225 1726882757.14730: done transferring module to remote 22225 1726882757.14737: _low_level_execute_command(): starting 22225 1726882757.14742: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/ /root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/AnsiballZ_command.py && sleep 0' 22225 1726882757.15185: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.15191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882757.15193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.15196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.15199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.15250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882757.15254: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.15310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.17127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.17173: stderr chunk (state=3): >>><<< 22225 1726882757.17177: stdout chunk (state=3): >>><<< 22225 1726882757.17194: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.17197: _low_level_execute_command(): starting 22225 1726882757.17199: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/AnsiballZ_command.py && sleep 0' 22225 1726882757.17636: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.17640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882757.17642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.17645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.17647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.17695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882757.17712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.17757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.34857: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:39:17.341937", "end": "2024-09-20 21:39:17.345951", "delta": "0:00:00.004014", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882757.36394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.15.7 closed. <<< 22225 1726882757.36458: stderr chunk (state=3): >>><<< 22225 1726882757.36462: stdout chunk (state=3): >>><<< 22225 1726882757.36476: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:39:17.341937", "end": "2024-09-20 21:39:17.345951", "delta": "0:00:00.004014", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
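For orientation: the module result above (cmd ["ip", "link", "set", "veth0", "up"]) and the earlier item "ip link set peerveth0 up" are loop items of the task "Create veth interface veth0" from manage_test_interface.yml. The task file itself is not reproduced in this log, so the following is only a sketch reconstructed from the item strings and the conditional evaluated above; the first loop item and the exact argument wording are assumptions.

- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  loop:
    # First item assumed; this excerpt only shows the two "set ... up" items.
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
    - ip link set peer{{ interface }} up
    - ip link set {{ interface }} up
  when: type == 'veth' and state == 'present' and interface not in current_interfaces

With interface set to veth0 from play vars, the loop expands to the exact commands seen in the module invocations in this log.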
22225 1726882757.36500: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882757.36505: _low_level_execute_command(): starting 22225 1726882757.36513: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882757.0871563-22652-147299955133495/ > /dev/null 2>&1 && sleep 0' 22225 1726882757.36999: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.37003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.37005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882757.37007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.37010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.37067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882757.37070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882757.37071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.37116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.39008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.39056: stderr chunk (state=3): >>><<< 22225 1726882757.39060: stdout chunk (state=3): >>><<< 22225 1726882757.39072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.39077: handler run complete 22225 1726882757.39096: Evaluated conditional (False): False 22225 1726882757.39105: attempt loop complete, returning result 22225 1726882757.39121: variable 'item' from source: unknown 22225 1726882757.39186: variable 'item' from source: unknown ok: [managed_node1] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.004014", "end": "2024-09-20 21:39:17.345951", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 21:39:17.341937" } 22225 1726882757.39368: dumping result to json 22225 1726882757.39371: done dumping result, returning 22225 1726882757.39373: done running TaskExecutor() for managed_node1/TASK: Create veth interface veth0 [0affc7ec-ae25-ec05-55b7-00000000015a] 22225 1726882757.39374: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015a 22225 1726882757.39420: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015a 22225 1726882757.39490: no more pending results, returning what we have 22225 1726882757.39494: results queue empty 22225 1726882757.39495: checking for any_errors_fatal 22225 1726882757.39499: done checking for any_errors_fatal 22225 1726882757.39500: checking for max_fail_percentage 22225 1726882757.39501: done checking for max_fail_percentage 22225 1726882757.39502: checking to see if all hosts have failed and the running result is not ok 22225 1726882757.39503: done checking to see if all hosts have failed 22225 1726882757.39503: getting the remaining hosts for this loop 22225 1726882757.39505: done getting the remaining hosts for this loop 22225 1726882757.39509: getting the next task for host managed_node1 22225 1726882757.39514: done getting next task for host managed_node1 22225 1726882757.39516: ^ task is: TASK: Set up veth as managed by NetworkManager 22225 1726882757.39526: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882757.39530: getting variables 22225 1726882757.39531: in VariableManager get_vars() 22225 1726882757.39560: Calling all_inventory to load vars for managed_node1 22225 1726882757.39563: Calling groups_inventory to load vars for managed_node1 22225 1726882757.39565: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882757.39575: Calling all_plugins_play to load vars for managed_node1 22225 1726882757.39577: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882757.39580: Calling groups_plugins_play to load vars for managed_node1 22225 1726882757.39706: WORKER PROCESS EXITING 22225 1726882757.39718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882757.39861: done with get_vars() 22225 1726882757.39869: done getting variables 22225 1726882757.39916: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:39:17 -0400 (0:00:01.261) 0:00:12.792 ****** 22225 1726882757.39940: entering _queue_task() for managed_node1/command 22225 1726882757.40149: worker is 1 (out of 1 available) 22225 1726882757.40163: exiting _queue_task() for managed_node1/command 22225 1726882757.40175: done queuing things up, now waiting for results queue to drain 22225 1726882757.40176: waiting for pending results... 
22225 1726882757.40332: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 22225 1726882757.40400: in run() - task 0affc7ec-ae25-ec05-55b7-00000000015b 22225 1726882757.40414: variable 'ansible_search_path' from source: unknown 22225 1726882757.40417: variable 'ansible_search_path' from source: unknown 22225 1726882757.40450: calling self._execute() 22225 1726882757.40515: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.40519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.40532: variable 'omit' from source: magic vars 22225 1726882757.40800: variable 'ansible_distribution_major_version' from source: facts 22225 1726882757.40810: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882757.40920: variable 'type' from source: play vars 22225 1726882757.40925: variable 'state' from source: include params 22225 1726882757.40931: Evaluated conditional (type == 'veth' and state == 'present'): True 22225 1726882757.40937: variable 'omit' from source: magic vars 22225 1726882757.40970: variable 'omit' from source: magic vars 22225 1726882757.41036: variable 'interface' from source: play vars 22225 1726882757.41049: variable 'omit' from source: magic vars 22225 1726882757.41086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882757.41112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882757.41130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882757.41171: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882757.41188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882757.41207: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882757.41211: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.41213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.41285: Set connection var ansible_connection to ssh 22225 1726882757.41293: Set connection var ansible_pipelining to False 22225 1726882757.41301: Set connection var ansible_shell_executable to /bin/sh 22225 1726882757.41307: Set connection var ansible_timeout to 10 22225 1726882757.41310: Set connection var ansible_shell_type to sh 22225 1726882757.41313: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882757.41335: variable 'ansible_shell_executable' from source: unknown 22225 1726882757.41338: variable 'ansible_connection' from source: unknown 22225 1726882757.41340: variable 'ansible_module_compression' from source: unknown 22225 1726882757.41343: variable 'ansible_shell_type' from source: unknown 22225 1726882757.41345: variable 'ansible_shell_executable' from source: unknown 22225 1726882757.41348: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.41351: variable 'ansible_pipelining' from source: unknown 22225 1726882757.41354: variable 'ansible_timeout' from source: unknown 22225 1726882757.41359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.41465: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882757.41475: variable 'omit' from source: magic vars 22225 1726882757.41483: starting attempt loop 22225 1726882757.41486: running the handler 22225 1726882757.41497: _low_level_execute_command(): starting 22225 1726882757.41503: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882757.42039: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.42043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.42046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.42048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.42102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882757.42105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.42161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.43799: stdout chunk (state=3): >>>/root <<< 22225 1726882757.43915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.43963: stderr chunk (state=3): >>><<< 22225 1726882757.43967: stdout chunk (state=3): >>><<< 22225 1726882757.43986: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 22225 1726882757.43998: _low_level_execute_command(): starting 22225 1726882757.44003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184 `" && echo ansible-tmp-1726882757.439858-22705-68649223656184="` echo /root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184 `" ) && sleep 0' 22225 1726882757.44468: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.44472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.44481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.44483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.44534: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882757.44538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.44595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.46539: stdout chunk (state=3): >>>ansible-tmp-1726882757.439858-22705-68649223656184=/root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184 <<< 22225 1726882757.46650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.46696: stderr chunk (state=3): >>><<< 22225 1726882757.46700: stdout chunk (state=3): >>><<< 22225 1726882757.46714: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882757.439858-22705-68649223656184=/root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.46743: variable 'ansible_module_compression' from source: unknown 22225 1726882757.46787: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882757.46816: variable 'ansible_facts' from source: unknown 22225 1726882757.46873: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/AnsiballZ_command.py 22225 1726882757.46974: Sending initial data 22225 1726882757.46977: Sent initial data (154 bytes) 22225 1726882757.47429: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.47432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882757.47436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882757.47439: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882757.47441: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.47488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882757.47492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.47553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.49149: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882757.49208: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882757.49267: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmphmflpkiv /root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/AnsiballZ_command.py <<< 22225 1726882757.49270: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/AnsiballZ_command.py" <<< 22225 1726882757.49313: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmphmflpkiv" to remote "/root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/AnsiballZ_command.py" <<< 22225 1726882757.50093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.50145: stderr chunk (state=3): >>><<< 22225 1726882757.50156: stdout chunk (state=3): >>><<< 22225 1726882757.50190: done transferring module to remote 22225 1726882757.50206: _low_level_execute_command(): starting 22225 1726882757.50215: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/ /root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/AnsiballZ_command.py && sleep 0' 22225 1726882757.50837: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882757.50851: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.50872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.50890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882757.50906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882757.50990: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.51025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882757.51048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882757.51063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.51150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.53025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.53036: stdout chunk (state=3): >>><<< 22225 1726882757.53048: stderr chunk (state=3): >>><<< 22225 1726882757.53068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.53077: _low_level_execute_command(): starting 22225 1726882757.53090: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/AnsiballZ_command.py && sleep 0' 22225 1726882757.53750: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882757.53764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.53786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.53898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882757.53940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.53991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.72468: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:39:17.702287", "end": "2024-09-20 21:39:17.722439", "delta": "0:00:00.020152", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882757.74041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882757.74096: stderr chunk (state=3): >>><<< 22225 1726882757.74099: stdout chunk (state=3): >>><<< 22225 1726882757.74115: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:39:17.702287", "end": "2024-09-20 21:39:17.722439", "delta": "0:00:00.020152", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
22225 1726882757.74147: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882757.74157: _low_level_execute_command(): starting 22225 1726882757.74160: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882757.439858-22705-68649223656184/ > /dev/null 2>&1 && sleep 0' 22225 1726882757.74639: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.74643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.74645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882757.74652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.74662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.74743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882757.74781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.74840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.76989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.76995: stdout chunk (state=3): >>><<< 22225 1726882757.76998: stderr chunk (state=3): >>><<< 22225 1726882757.77228: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.77232: handler run complete 22225 1726882757.77236: Evaluated conditional (False): False 22225 1726882757.77239: attempt loop complete, returning result 22225 1726882757.77241: _execute() done 22225 1726882757.77243: dumping result to json 22225 1726882757.77245: done dumping result, returning 22225 1726882757.77247: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0affc7ec-ae25-ec05-55b7-00000000015b] 22225 1726882757.77249: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015b 22225 1726882757.77344: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015b 22225 1726882757.77347: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.020152", "end": "2024-09-20 21:39:17.722439", "rc": 0, "start": "2024-09-20 21:39:17.702287" } 22225 1726882757.77434: no more pending results, returning what we have 22225 1726882757.77439: results queue empty 22225 1726882757.77440: checking for any_errors_fatal 22225 1726882757.77452: done checking for any_errors_fatal 22225 1726882757.77454: checking for max_fail_percentage 22225 1726882757.77456: done checking for max_fail_percentage 22225 1726882757.77462: checking to see if all hosts have failed and the running result is not ok 22225 1726882757.77464: done checking to see if all hosts have failed 22225 1726882757.77464: getting the remaining hosts for this loop 22225 1726882757.77466: done getting the remaining hosts for this loop 22225 1726882757.77471: getting the next task for host managed_node1 22225 1726882757.77479: done getting next task for host managed_node1 22225 1726882757.77485: ^ task is: TASK: Delete veth interface {{ interface }} 22225 1726882757.77488: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882757.77493: getting variables 22225 1726882757.77495: in VariableManager get_vars() 22225 1726882757.77653: Calling all_inventory to load vars for managed_node1 22225 1726882757.77657: Calling groups_inventory to load vars for managed_node1 22225 1726882757.77659: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882757.77673: Calling all_plugins_play to load vars for managed_node1 22225 1726882757.77755: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882757.77761: Calling groups_plugins_play to load vars for managed_node1 22225 1726882757.78220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882757.78641: done with get_vars() 22225 1726882757.78653: done getting variables 22225 1726882757.78725: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882757.78869: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:39:17 -0400 (0:00:00.389) 0:00:13.181 ****** 22225 1726882757.78910: entering _queue_task() for managed_node1/command 22225 1726882757.79431: worker is 1 (out of 1 available) 22225 1726882757.79445: exiting _queue_task() for managed_node1/command 22225 1726882757.79455: done queuing things up, now waiting for results queue to drain 22225 1726882757.79457: waiting for pending results... 
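For context, the "Delete veth interface veth0" task queued above is skipped in the trace that follows because its when clause evaluates to False. Judging only from what the trace shows (the command action plugin and the logged false_condition), the task at manage_test_interface.yml:43 presumably looks roughly like the sketch below; the when expression is copied from the trace, while the ip invocation itself is an assumption and is not confirmed by this log:

    - name: Delete veth interface {{ interface }}
      command: ip link del {{ interface }}
      when: type == 'veth' and state == 'absent' and interface in current_interfaces

The same pattern repeats for the dummy and tap create/delete tasks queued next, each with its own type/state guard, which is why every one of them is reported as "Conditional result was False" in this run (type is veth and state is present).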
22225 1726882757.79843: running TaskExecutor() for managed_node1/TASK: Delete veth interface veth0 22225 1726882757.79848: in run() - task 0affc7ec-ae25-ec05-55b7-00000000015c 22225 1726882757.79851: variable 'ansible_search_path' from source: unknown 22225 1726882757.79854: variable 'ansible_search_path' from source: unknown 22225 1726882757.79867: calling self._execute() 22225 1726882757.79967: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.79982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.79998: variable 'omit' from source: magic vars 22225 1726882757.80402: variable 'ansible_distribution_major_version' from source: facts 22225 1726882757.80425: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882757.80658: variable 'type' from source: play vars 22225 1726882757.80670: variable 'state' from source: include params 22225 1726882757.80680: variable 'interface' from source: play vars 22225 1726882757.80689: variable 'current_interfaces' from source: set_fact 22225 1726882757.80710: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 22225 1726882757.80719: when evaluation is False, skipping this task 22225 1726882757.80730: _execute() done 22225 1726882757.80843: dumping result to json 22225 1726882757.80847: done dumping result, returning 22225 1726882757.80850: done running TaskExecutor() for managed_node1/TASK: Delete veth interface veth0 [0affc7ec-ae25-ec05-55b7-00000000015c] 22225 1726882757.80852: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015c 22225 1726882757.81039: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015c 22225 1726882757.81042: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22225 1726882757.81092: no more pending results, returning what we have 22225 1726882757.81097: results queue empty 22225 1726882757.81099: checking for any_errors_fatal 22225 1726882757.81106: done checking for any_errors_fatal 22225 1726882757.81106: checking for max_fail_percentage 22225 1726882757.81108: done checking for max_fail_percentage 22225 1726882757.81109: checking to see if all hosts have failed and the running result is not ok 22225 1726882757.81110: done checking to see if all hosts have failed 22225 1726882757.81111: getting the remaining hosts for this loop 22225 1726882757.81113: done getting the remaining hosts for this loop 22225 1726882757.81117: getting the next task for host managed_node1 22225 1726882757.81127: done getting next task for host managed_node1 22225 1726882757.81129: ^ task is: TASK: Create dummy interface {{ interface }} 22225 1726882757.81136: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882757.81141: getting variables 22225 1726882757.81143: in VariableManager get_vars() 22225 1726882757.81187: Calling all_inventory to load vars for managed_node1 22225 1726882757.81190: Calling groups_inventory to load vars for managed_node1 22225 1726882757.81192: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882757.81208: Calling all_plugins_play to load vars for managed_node1 22225 1726882757.81211: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882757.81215: Calling groups_plugins_play to load vars for managed_node1 22225 1726882757.81746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882757.82000: done with get_vars() 22225 1726882757.82011: done getting variables 22225 1726882757.82079: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882757.82202: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:39:17 -0400 (0:00:00.033) 0:00:13.215 ****** 22225 1726882757.82242: entering _queue_task() for managed_node1/command 22225 1726882757.82506: worker is 1 (out of 1 available) 22225 1726882757.82521: exiting _queue_task() for managed_node1/command 22225 1726882757.82654: done queuing things up, now waiting for results queue to drain 22225 1726882757.82655: waiting for pending results... 
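All of the create/delete tasks in this stretch are gated on membership tests against current_interfaces, which the variable trace attributes to set_fact. A minimal, hypothetical sketch of how such a fact could be populated earlier in the test playbooks is shown below; the ls command and the register name are illustrative assumptions only and do not come from this log:

    - name: Gather names of existing interfaces
      command: ls /sys/class/net
      register: _existing_ifaces

    - name: Record them for the conditional create/delete tasks
      set_fact:
        current_interfaces: "{{ _existing_ifaces.stdout_lines }}"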
22225 1726882757.82730: running TaskExecutor() for managed_node1/TASK: Create dummy interface veth0 22225 1726882757.82797: in run() - task 0affc7ec-ae25-ec05-55b7-00000000015d 22225 1726882757.82809: variable 'ansible_search_path' from source: unknown 22225 1726882757.82812: variable 'ansible_search_path' from source: unknown 22225 1726882757.82849: calling self._execute() 22225 1726882757.82912: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.82917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.82927: variable 'omit' from source: magic vars 22225 1726882757.83227: variable 'ansible_distribution_major_version' from source: facts 22225 1726882757.83331: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882757.83478: variable 'type' from source: play vars 22225 1726882757.83491: variable 'state' from source: include params 22225 1726882757.83501: variable 'interface' from source: play vars 22225 1726882757.83510: variable 'current_interfaces' from source: set_fact 22225 1726882757.83525: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 22225 1726882757.83534: when evaluation is False, skipping this task 22225 1726882757.83541: _execute() done 22225 1726882757.83548: dumping result to json 22225 1726882757.83567: done dumping result, returning 22225 1726882757.83579: done running TaskExecutor() for managed_node1/TASK: Create dummy interface veth0 [0affc7ec-ae25-ec05-55b7-00000000015d] 22225 1726882757.83589: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015d skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 22225 1726882757.83840: no more pending results, returning what we have 22225 1726882757.83844: results queue empty 22225 1726882757.83845: checking for any_errors_fatal 22225 1726882757.83851: done checking for any_errors_fatal 22225 1726882757.83852: checking for max_fail_percentage 22225 1726882757.83854: done checking for max_fail_percentage 22225 1726882757.83855: checking to see if all hosts have failed and the running result is not ok 22225 1726882757.83856: done checking to see if all hosts have failed 22225 1726882757.83856: getting the remaining hosts for this loop 22225 1726882757.83858: done getting the remaining hosts for this loop 22225 1726882757.83863: getting the next task for host managed_node1 22225 1726882757.83868: done getting next task for host managed_node1 22225 1726882757.83871: ^ task is: TASK: Delete dummy interface {{ interface }} 22225 1726882757.83875: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882757.83879: getting variables 22225 1726882757.83881: in VariableManager get_vars() 22225 1726882757.83923: Calling all_inventory to load vars for managed_node1 22225 1726882757.83926: Calling groups_inventory to load vars for managed_node1 22225 1726882757.83929: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882757.83942: Calling all_plugins_play to load vars for managed_node1 22225 1726882757.83945: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882757.83948: Calling groups_plugins_play to load vars for managed_node1 22225 1726882757.84284: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015d 22225 1726882757.84292: WORKER PROCESS EXITING 22225 1726882757.84320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882757.84459: done with get_vars() 22225 1726882757.84467: done getting variables 22225 1726882757.84508: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882757.84588: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:39:17 -0400 (0:00:00.023) 0:00:13.238 ****** 22225 1726882757.84608: entering _queue_task() for managed_node1/command 22225 1726882757.84788: worker is 1 (out of 1 available) 22225 1726882757.84803: exiting _queue_task() for managed_node1/command 22225 1726882757.84815: done queuing things up, now waiting for results queue to drain 22225 1726882757.84816: waiting for pending results... 
22225 1726882757.84974: running TaskExecutor() for managed_node1/TASK: Delete dummy interface veth0 22225 1726882757.85028: in run() - task 0affc7ec-ae25-ec05-55b7-00000000015e 22225 1726882757.85041: variable 'ansible_search_path' from source: unknown 22225 1726882757.85046: variable 'ansible_search_path' from source: unknown 22225 1726882757.85074: calling self._execute() 22225 1726882757.85137: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.85143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.85151: variable 'omit' from source: magic vars 22225 1726882757.85408: variable 'ansible_distribution_major_version' from source: facts 22225 1726882757.85418: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882757.85553: variable 'type' from source: play vars 22225 1726882757.85557: variable 'state' from source: include params 22225 1726882757.85562: variable 'interface' from source: play vars 22225 1726882757.85565: variable 'current_interfaces' from source: set_fact 22225 1726882757.85573: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 22225 1726882757.85576: when evaluation is False, skipping this task 22225 1726882757.85582: _execute() done 22225 1726882757.85586: dumping result to json 22225 1726882757.85588: done dumping result, returning 22225 1726882757.85592: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface veth0 [0affc7ec-ae25-ec05-55b7-00000000015e] 22225 1726882757.85595: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015e 22225 1726882757.85686: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015e 22225 1726882757.85691: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22225 1726882757.85744: no more pending results, returning what we have 22225 1726882757.85747: results queue empty 22225 1726882757.85748: checking for any_errors_fatal 22225 1726882757.85752: done checking for any_errors_fatal 22225 1726882757.85752: checking for max_fail_percentage 22225 1726882757.85754: done checking for max_fail_percentage 22225 1726882757.85755: checking to see if all hosts have failed and the running result is not ok 22225 1726882757.85756: done checking to see if all hosts have failed 22225 1726882757.85756: getting the remaining hosts for this loop 22225 1726882757.85757: done getting the remaining hosts for this loop 22225 1726882757.85761: getting the next task for host managed_node1 22225 1726882757.85765: done getting next task for host managed_node1 22225 1726882757.85767: ^ task is: TASK: Create tap interface {{ interface }} 22225 1726882757.85770: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882757.85773: getting variables 22225 1726882757.85774: in VariableManager get_vars() 22225 1726882757.85804: Calling all_inventory to load vars for managed_node1 22225 1726882757.85806: Calling groups_inventory to load vars for managed_node1 22225 1726882757.85808: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882757.85815: Calling all_plugins_play to load vars for managed_node1 22225 1726882757.85817: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882757.85818: Calling groups_plugins_play to load vars for managed_node1 22225 1726882757.85938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882757.86075: done with get_vars() 22225 1726882757.86084: done getting variables 22225 1726882757.86127: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882757.86203: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:39:17 -0400 (0:00:00.016) 0:00:13.254 ****** 22225 1726882757.86227: entering _queue_task() for managed_node1/command 22225 1726882757.86403: worker is 1 (out of 1 available) 22225 1726882757.86419: exiting _queue_task() for managed_node1/command 22225 1726882757.86432: done queuing things up, now waiting for results queue to drain 22225 1726882757.86434: waiting for pending results... 
22225 1726882757.86568: running TaskExecutor() for managed_node1/TASK: Create tap interface veth0 22225 1726882757.86625: in run() - task 0affc7ec-ae25-ec05-55b7-00000000015f 22225 1726882757.86636: variable 'ansible_search_path' from source: unknown 22225 1726882757.86639: variable 'ansible_search_path' from source: unknown 22225 1726882757.86667: calling self._execute() 22225 1726882757.86728: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.86734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.86742: variable 'omit' from source: magic vars 22225 1726882757.87017: variable 'ansible_distribution_major_version' from source: facts 22225 1726882757.87021: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882757.87139: variable 'type' from source: play vars 22225 1726882757.87143: variable 'state' from source: include params 22225 1726882757.87148: variable 'interface' from source: play vars 22225 1726882757.87151: variable 'current_interfaces' from source: set_fact 22225 1726882757.87159: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 22225 1726882757.87162: when evaluation is False, skipping this task 22225 1726882757.87164: _execute() done 22225 1726882757.87167: dumping result to json 22225 1726882757.87169: done dumping result, returning 22225 1726882757.87176: done running TaskExecutor() for managed_node1/TASK: Create tap interface veth0 [0affc7ec-ae25-ec05-55b7-00000000015f] 22225 1726882757.87183: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015f 22225 1726882757.87268: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000015f 22225 1726882757.87271: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 22225 1726882757.87345: no more pending results, returning what we have 22225 1726882757.87347: results queue empty 22225 1726882757.87348: checking for any_errors_fatal 22225 1726882757.87353: done checking for any_errors_fatal 22225 1726882757.87353: checking for max_fail_percentage 22225 1726882757.87355: done checking for max_fail_percentage 22225 1726882757.87356: checking to see if all hosts have failed and the running result is not ok 22225 1726882757.87357: done checking to see if all hosts have failed 22225 1726882757.87357: getting the remaining hosts for this loop 22225 1726882757.87358: done getting the remaining hosts for this loop 22225 1726882757.87361: getting the next task for host managed_node1 22225 1726882757.87364: done getting next task for host managed_node1 22225 1726882757.87366: ^ task is: TASK: Delete tap interface {{ interface }} 22225 1726882757.87368: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882757.87370: getting variables 22225 1726882757.87371: in VariableManager get_vars() 22225 1726882757.87401: Calling all_inventory to load vars for managed_node1 22225 1726882757.87403: Calling groups_inventory to load vars for managed_node1 22225 1726882757.87404: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882757.87412: Calling all_plugins_play to load vars for managed_node1 22225 1726882757.87413: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882757.87415: Calling groups_plugins_play to load vars for managed_node1 22225 1726882757.87563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882757.87701: done with get_vars() 22225 1726882757.87707: done getting variables 22225 1726882757.87749: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882757.87820: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:39:17 -0400 (0:00:00.016) 0:00:13.271 ****** 22225 1726882757.87843: entering _queue_task() for managed_node1/command 22225 1726882757.88020: worker is 1 (out of 1 available) 22225 1726882757.88038: exiting _queue_task() for managed_node1/command 22225 1726882757.88049: done queuing things up, now waiting for results queue to drain 22225 1726882757.88051: waiting for pending results... 
22225 1726882757.88186: running TaskExecutor() for managed_node1/TASK: Delete tap interface veth0 22225 1726882757.88239: in run() - task 0affc7ec-ae25-ec05-55b7-000000000160 22225 1726882757.88251: variable 'ansible_search_path' from source: unknown 22225 1726882757.88254: variable 'ansible_search_path' from source: unknown 22225 1726882757.88285: calling self._execute() 22225 1726882757.88344: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.88348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.88357: variable 'omit' from source: magic vars 22225 1726882757.88598: variable 'ansible_distribution_major_version' from source: facts 22225 1726882757.88608: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882757.88750: variable 'type' from source: play vars 22225 1726882757.88753: variable 'state' from source: include params 22225 1726882757.88760: variable 'interface' from source: play vars 22225 1726882757.88763: variable 'current_interfaces' from source: set_fact 22225 1726882757.88770: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 22225 1726882757.88773: when evaluation is False, skipping this task 22225 1726882757.88775: _execute() done 22225 1726882757.88778: dumping result to json 22225 1726882757.88781: done dumping result, returning 22225 1726882757.88790: done running TaskExecutor() for managed_node1/TASK: Delete tap interface veth0 [0affc7ec-ae25-ec05-55b7-000000000160] 22225 1726882757.88795: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000160 22225 1726882757.88882: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000160 22225 1726882757.88886: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22225 1726882757.88934: no more pending results, returning what we have 22225 1726882757.88937: results queue empty 22225 1726882757.88938: checking for any_errors_fatal 22225 1726882757.88941: done checking for any_errors_fatal 22225 1726882757.88942: checking for max_fail_percentage 22225 1726882757.88944: done checking for max_fail_percentage 22225 1726882757.88944: checking to see if all hosts have failed and the running result is not ok 22225 1726882757.88945: done checking to see if all hosts have failed 22225 1726882757.88946: getting the remaining hosts for this loop 22225 1726882757.88947: done getting the remaining hosts for this loop 22225 1726882757.88950: getting the next task for host managed_node1 22225 1726882757.88956: done getting next task for host managed_node1 22225 1726882757.88958: ^ task is: TASK: Set up gateway ip on veth peer 22225 1726882757.88961: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882757.88964: getting variables 22225 1726882757.88965: in VariableManager get_vars() 22225 1726882757.89000: Calling all_inventory to load vars for managed_node1 22225 1726882757.89003: Calling groups_inventory to load vars for managed_node1 22225 1726882757.89004: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882757.89011: Calling all_plugins_play to load vars for managed_node1 22225 1726882757.89013: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882757.89015: Calling groups_plugins_play to load vars for managed_node1 22225 1726882757.89134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882757.89268: done with get_vars() 22225 1726882757.89274: done getting variables 22225 1726882757.89340: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set up gateway ip on veth peer] ****************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:15 Friday 20 September 2024 21:39:17 -0400 (0:00:00.015) 0:00:13.286 ****** 22225 1726882757.89358: entering _queue_task() for managed_node1/shell 22225 1726882757.89359: Creating lock for shell 22225 1726882757.89526: worker is 1 (out of 1 available) 22225 1726882757.89539: exiting _queue_task() for managed_node1/shell 22225 1726882757.89548: done queuing things up, now waiting for results queue to drain 22225 1726882757.89549: waiting for pending results... 
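The "Set up gateway ip on veth peer" task queued above (tests_ipv6.yml:15) runs through the shell action plugin, and its script body appears verbatim in the module result further down in the trace. A rough reconstruction of the task, assuming only what the trace shows, is:

    - name: Set up gateway ip on veth peer
      shell: |
        ip netns add ns1
        ip link set peerveth0 netns ns1
        ip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0
        ip netns exec ns1 ip link set peerveth0 up
      changed_when: false

The peer device name is probably templated as peer{{ interface }} in the actual playbook, since the interface variable is read while the task is built, and the changed_when guess is inferred from the final result being reported as ok with changed=false even though the module itself returned changed=true; neither detail is confirmed by this log. The script creates a network namespace ns1, moves the veth peer into it, assigns the static IPv6 gateway address 2001:db8::1/32, and brings the link up, giving the host-side test interface a reachable IPv6 gateway for the test that follows.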
22225 1726882757.89690: running TaskExecutor() for managed_node1/TASK: Set up gateway ip on veth peer 22225 1726882757.89742: in run() - task 0affc7ec-ae25-ec05-55b7-00000000000d 22225 1726882757.89752: variable 'ansible_search_path' from source: unknown 22225 1726882757.89782: calling self._execute() 22225 1726882757.89843: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.89847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.89856: variable 'omit' from source: magic vars 22225 1726882757.90152: variable 'ansible_distribution_major_version' from source: facts 22225 1726882757.90160: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882757.90166: variable 'omit' from source: magic vars 22225 1726882757.90189: variable 'omit' from source: magic vars 22225 1726882757.90282: variable 'interface' from source: play vars 22225 1726882757.90298: variable 'omit' from source: magic vars 22225 1726882757.90332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882757.90359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882757.90374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882757.90390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882757.90399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882757.90421: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882757.90427: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.90430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.90498: Set connection var ansible_connection to ssh 22225 1726882757.90507: Set connection var ansible_pipelining to False 22225 1726882757.90515: Set connection var ansible_shell_executable to /bin/sh 22225 1726882757.90520: Set connection var ansible_timeout to 10 22225 1726882757.90526: Set connection var ansible_shell_type to sh 22225 1726882757.90531: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882757.90551: variable 'ansible_shell_executable' from source: unknown 22225 1726882757.90554: variable 'ansible_connection' from source: unknown 22225 1726882757.90557: variable 'ansible_module_compression' from source: unknown 22225 1726882757.90560: variable 'ansible_shell_type' from source: unknown 22225 1726882757.90562: variable 'ansible_shell_executable' from source: unknown 22225 1726882757.90565: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882757.90568: variable 'ansible_pipelining' from source: unknown 22225 1726882757.90572: variable 'ansible_timeout' from source: unknown 22225 1726882757.90576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882757.90683: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882757.90697: variable 'omit' from source: magic vars 22225 
1726882757.90702: starting attempt loop 22225 1726882757.90705: running the handler 22225 1726882757.90713: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882757.90730: _low_level_execute_command(): starting 22225 1726882757.90736: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882757.91265: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.91269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.91272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882757.91274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882757.91277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.91328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882757.91334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882757.91345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.91398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.93088: stdout chunk (state=3): >>>/root <<< 22225 1726882757.93196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.93247: stderr chunk (state=3): >>><<< 22225 1726882757.93251: stdout chunk (state=3): >>><<< 22225 1726882757.93271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.93285: _low_level_execute_command(): starting 22225 1726882757.93290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075 `" && echo ansible-tmp-1726882757.9326892-22737-142609235482075="` echo /root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075 `" ) && sleep 0' 22225 1726882757.93751: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882757.93761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.93764: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.93766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.93810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882757.93814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.93873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.95812: stdout chunk (state=3): >>>ansible-tmp-1726882757.9326892-22737-142609235482075=/root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075 <<< 22225 1726882757.95936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.95981: stderr chunk (state=3): >>><<< 22225 1726882757.95986: stdout chunk (state=3): >>><<< 22225 1726882757.96002: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882757.9326892-22737-142609235482075=/root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882757.96028: variable 'ansible_module_compression' from source: unknown 22225 1726882757.96067: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882757.96102: variable 'ansible_facts' from source: unknown 22225 1726882757.96158: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/AnsiballZ_command.py 22225 1726882757.96260: Sending initial data 22225 1726882757.96263: Sent initial data (156 bytes) 22225 1726882757.96718: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.96725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.96728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.96730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882757.96734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.96786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882757.96789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.96839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882757.98420: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 22225 1726882757.98428: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882757.98471: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882757.98521: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp70u0fd91 /root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/AnsiballZ_command.py <<< 22225 1726882757.98526: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/AnsiballZ_command.py" <<< 22225 1726882757.98574: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp70u0fd91" to remote "/root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/AnsiballZ_command.py" <<< 22225 1726882757.99136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882757.99198: stderr chunk (state=3): >>><<< 22225 1726882757.99202: stdout chunk (state=3): >>><<< 22225 1726882757.99219: done transferring module to remote 22225 1726882757.99231: _low_level_execute_command(): starting 22225 1726882757.99239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/ /root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/AnsiballZ_command.py && sleep 0' 22225 1726882757.99685: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882757.99689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882757.99691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882757.99697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882757.99744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882757.99747: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882757.99802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882758.01606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882758.01656: stderr chunk (state=3): >>><<< 22225 1726882758.01659: stdout chunk (state=3): >>><<< 22225 1726882758.01673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882758.01676: _low_level_execute_command(): starting 22225 1726882758.01684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/AnsiballZ_command.py && sleep 0' 22225 1726882758.02313: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882758.02337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882758.02355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882758.02442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882758.21036: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 21:39:18.186974", "end": "2024-09-20 21:39:18.208310", "delta": "0:00:00.021336", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882758.22847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882758.22860: stdout chunk (state=3): >>><<< 22225 1726882758.22873: stderr chunk (state=3): >>><<< 22225 1726882758.22902: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 21:39:18.186974", "end": "2024-09-20 21:39:18.208310", "delta": "0:00:00.021336", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
22225 1726882758.23033: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882758.23037: _low_level_execute_command(): starting 22225 1726882758.23040: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882757.9326892-22737-142609235482075/ > /dev/null 2>&1 && sleep 0' 22225 1726882758.23751: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882758.23831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.23891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882758.23912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882758.23969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882758.24029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882758.25895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882758.25943: stderr chunk (state=3): >>><<< 22225 1726882758.25947: stdout chunk (state=3): >>><<< 22225 1726882758.25962: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882758.25969: handler run complete 22225 1726882758.25993: Evaluated conditional (False): False 22225 1726882758.26026: attempt loop complete, returning result 22225 1726882758.26030: _execute() done 22225 1726882758.26032: dumping result to json 22225 1726882758.26035: done dumping result, returning 22225 1726882758.26038: done running TaskExecutor() for managed_node1/TASK: Set up gateway ip on veth peer [0affc7ec-ae25-ec05-55b7-00000000000d] 22225 1726882758.26040: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000000d 22225 1726882758.26132: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000000d 22225 1726882758.26134: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "delta": "0:00:00.021336", "end": "2024-09-20 21:39:18.208310", "rc": 0, "start": "2024-09-20 21:39:18.186974" } 22225 1726882758.26269: no more pending results, returning what we have 22225 1726882758.26273: results queue empty 22225 1726882758.26274: checking for any_errors_fatal 22225 1726882758.26278: done checking for any_errors_fatal 22225 1726882758.26278: checking for max_fail_percentage 22225 1726882758.26282: done checking for max_fail_percentage 22225 1726882758.26283: checking to see if all hosts have failed and the running result is not ok 22225 1726882758.26284: done checking to see if all hosts have failed 22225 1726882758.26285: getting the remaining hosts for this loop 22225 1726882758.26286: done getting the remaining hosts for this loop 22225 1726882758.26289: getting the next task for host managed_node1 22225 1726882758.26295: done getting next task for host managed_node1 22225 1726882758.26297: ^ task is: TASK: TEST: I can configure an interface with static ipv6 config 22225 1726882758.26299: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882758.26302: getting variables 22225 1726882758.26303: in VariableManager get_vars() 22225 1726882758.26339: Calling all_inventory to load vars for managed_node1 22225 1726882758.26342: Calling groups_inventory to load vars for managed_node1 22225 1726882758.26344: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882758.26354: Calling all_plugins_play to load vars for managed_node1 22225 1726882758.26357: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882758.26365: Calling groups_plugins_play to load vars for managed_node1 22225 1726882758.26548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882758.26831: done with get_vars() 22225 1726882758.26845: done getting variables 22225 1726882758.26910: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with static ipv6 config] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:27 Friday 20 September 2024 21:39:18 -0400 (0:00:00.375) 0:00:13.662 ****** 22225 1726882758.26947: entering _queue_task() for managed_node1/debug 22225 1726882758.27360: worker is 1 (out of 1 available) 22225 1726882758.27373: exiting _queue_task() for managed_node1/debug 22225 1726882758.27386: done queuing things up, now waiting for results queue to drain 22225 1726882758.27387: waiting for pending results... 22225 1726882758.27575: running TaskExecutor() for managed_node1/TASK: TEST: I can configure an interface with static ipv6 config 22225 1726882758.27616: in run() - task 0affc7ec-ae25-ec05-55b7-00000000000f 22225 1726882758.27632: variable 'ansible_search_path' from source: unknown 22225 1726882758.27668: calling self._execute() 22225 1726882758.27750: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882758.27757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.27765: variable 'omit' from source: magic vars 22225 1726882758.28050: variable 'ansible_distribution_major_version' from source: facts 22225 1726882758.28060: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882758.28065: variable 'omit' from source: magic vars 22225 1726882758.28128: variable 'omit' from source: magic vars 22225 1726882758.28132: variable 'omit' from source: magic vars 22225 1726882758.28144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882758.28174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882758.28193: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882758.28206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882758.28216: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882758.28244: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882758.28247: variable 'ansible_host' from 
source: host vars for 'managed_node1' 22225 1726882758.28250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.28320: Set connection var ansible_connection to ssh 22225 1726882758.28331: Set connection var ansible_pipelining to False 22225 1726882758.28338: Set connection var ansible_shell_executable to /bin/sh 22225 1726882758.28344: Set connection var ansible_timeout to 10 22225 1726882758.28346: Set connection var ansible_shell_type to sh 22225 1726882758.28352: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882758.28375: variable 'ansible_shell_executable' from source: unknown 22225 1726882758.28378: variable 'ansible_connection' from source: unknown 22225 1726882758.28383: variable 'ansible_module_compression' from source: unknown 22225 1726882758.28386: variable 'ansible_shell_type' from source: unknown 22225 1726882758.28389: variable 'ansible_shell_executable' from source: unknown 22225 1726882758.28391: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882758.28393: variable 'ansible_pipelining' from source: unknown 22225 1726882758.28397: variable 'ansible_timeout' from source: unknown 22225 1726882758.28399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.28512: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882758.28523: variable 'omit' from source: magic vars 22225 1726882758.28530: starting attempt loop 22225 1726882758.28533: running the handler 22225 1726882758.28569: handler run complete 22225 1726882758.28584: attempt loop complete, returning result 22225 1726882758.28589: _execute() done 22225 1726882758.28596: dumping result to json 22225 1726882758.28598: done dumping result, returning 22225 1726882758.28605: done running TaskExecutor() for managed_node1/TASK: TEST: I can configure an interface with static ipv6 config [0affc7ec-ae25-ec05-55b7-00000000000f] 22225 1726882758.28610: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000000f 22225 1726882758.28698: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000000f 22225 1726882758.28702: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 22225 1726882758.28755: no more pending results, returning what we have 22225 1726882758.28759: results queue empty 22225 1726882758.28760: checking for any_errors_fatal 22225 1726882758.28769: done checking for any_errors_fatal 22225 1726882758.28770: checking for max_fail_percentage 22225 1726882758.28771: done checking for max_fail_percentage 22225 1726882758.28772: checking to see if all hosts have failed and the running result is not ok 22225 1726882758.28773: done checking to see if all hosts have failed 22225 1726882758.28774: getting the remaining hosts for this loop 22225 1726882758.28776: done getting the remaining hosts for this loop 22225 1726882758.28780: getting the next task for host managed_node1 22225 1726882758.28787: done getting next task for host managed_node1 22225 1726882758.28793: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22225 1726882758.28796: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882758.28810: getting variables 22225 1726882758.28811: in VariableManager get_vars() 22225 1726882758.28851: Calling all_inventory to load vars for managed_node1 22225 1726882758.28854: Calling groups_inventory to load vars for managed_node1 22225 1726882758.28856: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882758.28864: Calling all_plugins_play to load vars for managed_node1 22225 1726882758.28867: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882758.28869: Calling groups_plugins_play to load vars for managed_node1 22225 1726882758.29008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882758.29340: done with get_vars() 22225 1726882758.29348: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:39:18 -0400 (0:00:00.024) 0:00:13.686 ****** 22225 1726882758.29419: entering _queue_task() for managed_node1/include_tasks 22225 1726882758.29661: worker is 1 (out of 1 available) 22225 1726882758.29675: exiting _queue_task() for managed_node1/include_tasks 22225 1726882758.29689: done queuing things up, now waiting for results queue to drain 22225 1726882758.29691: waiting for pending results... 
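
For reference, the "Set up gateway ip on veth peer" result recorded above came from the shell commands shown in its cmd field. A minimal sketch of such a task in playbook form follows; the task name and commands are taken verbatim from the log, while the module spelling and surrounding structure are assumptions, since tests_ipv6.yml itself is not reproduced here (the log shows the task dispatched as ansible.legacy.command with _uses_shell=True, which is how shell tasks are executed).

- name: Set up gateway ip on veth peer
  ansible.builtin.shell: |
    ip netns add ns1
    ip link set peerveth0 netns ns1
    ip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0
    ip netns exec ns1 ip link set peerveth0 up
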
22225 1726882758.30040: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22225 1726882758.30046: in run() - task 0affc7ec-ae25-ec05-55b7-000000000017 22225 1726882758.30049: variable 'ansible_search_path' from source: unknown 22225 1726882758.30052: variable 'ansible_search_path' from source: unknown 22225 1726882758.30094: calling self._execute() 22225 1726882758.30185: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882758.30196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.30209: variable 'omit' from source: magic vars 22225 1726882758.30661: variable 'ansible_distribution_major_version' from source: facts 22225 1726882758.30698: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882758.30710: _execute() done 22225 1726882758.30719: dumping result to json 22225 1726882758.30728: done dumping result, returning 22225 1726882758.30740: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-ec05-55b7-000000000017] 22225 1726882758.30751: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000017 22225 1726882758.30933: no more pending results, returning what we have 22225 1726882758.30938: in VariableManager get_vars() 22225 1726882758.30992: Calling all_inventory to load vars for managed_node1 22225 1726882758.30995: Calling groups_inventory to load vars for managed_node1 22225 1726882758.30997: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882758.31012: Calling all_plugins_play to load vars for managed_node1 22225 1726882758.31014: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882758.31017: Calling groups_plugins_play to load vars for managed_node1 22225 1726882758.31359: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000017 22225 1726882758.31363: WORKER PROCESS EXITING 22225 1726882758.31393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882758.31578: done with get_vars() 22225 1726882758.31585: variable 'ansible_search_path' from source: unknown 22225 1726882758.31586: variable 'ansible_search_path' from source: unknown 22225 1726882758.31614: we have included files to process 22225 1726882758.31615: generating all_blocks data 22225 1726882758.31616: done generating all_blocks data 22225 1726882758.31619: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22225 1726882758.31620: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22225 1726882758.31624: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22225 1726882758.32162: done processing included file 22225 1726882758.32164: iterating over new_blocks loaded from include file 22225 1726882758.32165: in VariableManager get_vars() 22225 1726882758.32181: done with get_vars() 22225 1726882758.32182: filtering new block on tags 22225 1726882758.32194: done filtering new block on tags 22225 1726882758.32196: in VariableManager get_vars() 22225 1726882758.32212: done with get_vars() 22225 1726882758.32213: filtering new block on tags 22225 1726882758.32230: done filtering new block on tags 22225 1726882758.32232: in 
VariableManager get_vars() 22225 1726882758.32247: done with get_vars() 22225 1726882758.32248: filtering new block on tags 22225 1726882758.32259: done filtering new block on tags 22225 1726882758.32261: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 22225 1726882758.32264: extending task lists for all hosts with included blocks 22225 1726882758.32800: done extending task lists 22225 1726882758.32801: done processing included files 22225 1726882758.32802: results queue empty 22225 1726882758.32802: checking for any_errors_fatal 22225 1726882758.32804: done checking for any_errors_fatal 22225 1726882758.32805: checking for max_fail_percentage 22225 1726882758.32806: done checking for max_fail_percentage 22225 1726882758.32806: checking to see if all hosts have failed and the running result is not ok 22225 1726882758.32807: done checking to see if all hosts have failed 22225 1726882758.32807: getting the remaining hosts for this loop 22225 1726882758.32808: done getting the remaining hosts for this loop 22225 1726882758.32810: getting the next task for host managed_node1 22225 1726882758.32813: done getting next task for host managed_node1 22225 1726882758.32815: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22225 1726882758.32817: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882758.32825: getting variables 22225 1726882758.32826: in VariableManager get_vars() 22225 1726882758.32837: Calling all_inventory to load vars for managed_node1 22225 1726882758.32839: Calling groups_inventory to load vars for managed_node1 22225 1726882758.32840: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882758.32844: Calling all_plugins_play to load vars for managed_node1 22225 1726882758.32846: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882758.32847: Calling groups_plugins_play to load vars for managed_node1 22225 1726882758.32962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882758.33130: done with get_vars() 22225 1726882758.33137: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:39:18 -0400 (0:00:00.037) 0:00:13.724 ****** 22225 1726882758.33202: entering _queue_task() for managed_node1/setup 22225 1726882758.33459: worker is 1 (out of 1 available) 22225 1726882758.33475: exiting _queue_task() for managed_node1/setup 22225 1726882758.33486: done queuing things up, now waiting for results queue to drain 22225 1726882758.33488: waiting for pending results... 22225 1726882758.33839: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22225 1726882758.33853: in run() - task 0affc7ec-ae25-ec05-55b7-0000000001fc 22225 1726882758.33877: variable 'ansible_search_path' from source: unknown 22225 1726882758.33887: variable 'ansible_search_path' from source: unknown 22225 1726882758.33936: calling self._execute() 22225 1726882758.34029: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882758.34041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.34061: variable 'omit' from source: magic vars 22225 1726882758.34453: variable 'ansible_distribution_major_version' from source: facts 22225 1726882758.34470: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882758.34712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882758.36840: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882758.36899: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882758.36929: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882758.36955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882758.36979: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882758.37043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882758.37064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 22225 1726882758.37084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882758.37118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882758.37131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882758.37172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882758.37193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882758.37212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882758.37242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882758.37253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882758.37368: variable '__network_required_facts' from source: role '' defaults 22225 1726882758.37375: variable 'ansible_facts' from source: unknown 22225 1726882758.37448: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22225 1726882758.37452: when evaluation is False, skipping this task 22225 1726882758.37455: _execute() done 22225 1726882758.37457: dumping result to json 22225 1726882758.37460: done dumping result, returning 22225 1726882758.37466: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-ec05-55b7-0000000001fc] 22225 1726882758.37471: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001fc skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882758.37616: no more pending results, returning what we have 22225 1726882758.37621: results queue empty 22225 1726882758.37624: checking for any_errors_fatal 22225 1726882758.37625: done checking for any_errors_fatal 22225 1726882758.37626: checking for max_fail_percentage 22225 1726882758.37627: done checking for max_fail_percentage 22225 1726882758.37628: checking to see if all hosts have failed and the running result is not ok 22225 1726882758.37629: done checking to see if all hosts have failed 22225 1726882758.37630: getting the remaining hosts for this loop 22225 1726882758.37631: done getting the remaining hosts for this loop 22225 1726882758.37638: getting the next task for host managed_node1 22225 1726882758.37648: done getting next task for host 
managed_node1 22225 1726882758.37652: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22225 1726882758.37656: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882758.37671: getting variables 22225 1726882758.37672: in VariableManager get_vars() 22225 1726882758.37715: Calling all_inventory to load vars for managed_node1 22225 1726882758.37718: Calling groups_inventory to load vars for managed_node1 22225 1726882758.37720: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882758.37732: Calling all_plugins_play to load vars for managed_node1 22225 1726882758.37734: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882758.37737: Calling groups_plugins_play to load vars for managed_node1 22225 1726882758.37889: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001fc 22225 1726882758.37893: WORKER PROCESS EXITING 22225 1726882758.37904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882758.38061: done with get_vars() 22225 1726882758.38070: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:39:18 -0400 (0:00:00.049) 0:00:13.774 ****** 22225 1726882758.38147: entering _queue_task() for managed_node1/stat 22225 1726882758.38413: worker is 1 (out of 1 available) 22225 1726882758.38430: exiting _queue_task() for managed_node1/stat 22225 1726882758.38444: done queuing things up, now waiting for results queue to drain 22225 1726882758.38446: waiting for pending results... 
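
The "Ensure ansible_facts used by role are present" task above was skipped because its conditional, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, evaluated to False, i.e. every fact the role needs is already present. A hedged sketch of that gating pattern follows; the setup arguments are assumptions, as the log only shows that the task uses the setup action together with this conditional.

- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
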
22225 1726882758.38844: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 22225 1726882758.38877: in run() - task 0affc7ec-ae25-ec05-55b7-0000000001fe 22225 1726882758.38901: variable 'ansible_search_path' from source: unknown 22225 1726882758.38909: variable 'ansible_search_path' from source: unknown 22225 1726882758.38956: calling self._execute() 22225 1726882758.39055: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882758.39068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.39083: variable 'omit' from source: magic vars 22225 1726882758.39467: variable 'ansible_distribution_major_version' from source: facts 22225 1726882758.39477: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882758.39602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882758.39871: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882758.39906: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882758.39935: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882758.39961: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882758.40030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882758.40052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882758.40071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882758.40091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882758.40158: variable '__network_is_ostree' from source: set_fact 22225 1726882758.40167: Evaluated conditional (not __network_is_ostree is defined): False 22225 1726882758.40170: when evaluation is False, skipping this task 22225 1726882758.40172: _execute() done 22225 1726882758.40175: dumping result to json 22225 1726882758.40177: done dumping result, returning 22225 1726882758.40185: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-ec05-55b7-0000000001fe] 22225 1726882758.40188: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001fe 22225 1726882758.40277: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001fe 22225 1726882758.40282: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22225 1726882758.40334: no more pending results, returning what we have 22225 1726882758.40338: results queue empty 22225 1726882758.40339: checking for any_errors_fatal 22225 1726882758.40345: done checking for any_errors_fatal 22225 1726882758.40346: checking for 
max_fail_percentage 22225 1726882758.40348: done checking for max_fail_percentage 22225 1726882758.40349: checking to see if all hosts have failed and the running result is not ok 22225 1726882758.40350: done checking to see if all hosts have failed 22225 1726882758.40351: getting the remaining hosts for this loop 22225 1726882758.40352: done getting the remaining hosts for this loop 22225 1726882758.40356: getting the next task for host managed_node1 22225 1726882758.40363: done getting next task for host managed_node1 22225 1726882758.40366: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22225 1726882758.40370: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882758.40386: getting variables 22225 1726882758.40387: in VariableManager get_vars() 22225 1726882758.40425: Calling all_inventory to load vars for managed_node1 22225 1726882758.40428: Calling groups_inventory to load vars for managed_node1 22225 1726882758.40430: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882758.40439: Calling all_plugins_play to load vars for managed_node1 22225 1726882758.40442: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882758.40445: Calling groups_plugins_play to load vars for managed_node1 22225 1726882758.40617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882758.40765: done with get_vars() 22225 1726882758.40773: done getting variables 22225 1726882758.40818: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:39:18 -0400 (0:00:00.027) 0:00:13.801 ****** 22225 1726882758.40848: entering _queue_task() for managed_node1/set_fact 22225 1726882758.41056: worker is 1 (out of 1 available) 22225 1726882758.41071: exiting _queue_task() for managed_node1/set_fact 22225 1726882758.41085: done queuing things up, now waiting for results queue to drain 22225 1726882758.41087: waiting for pending results... 
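
The "Check if system is ostree" stat task above was skipped because __network_is_ostree is already defined, and the "Set flag to indicate system is ostree" task queued next carries the same not __network_is_ostree is defined guard. A sketch of that pair, assuming a typical stat-then-set_fact shape (the stat path and the register name are assumptions, not taken from the log):

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
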
22225 1726882758.41245: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22225 1726882758.41347: in run() - task 0affc7ec-ae25-ec05-55b7-0000000001ff 22225 1726882758.41359: variable 'ansible_search_path' from source: unknown 22225 1726882758.41362: variable 'ansible_search_path' from source: unknown 22225 1726882758.41392: calling self._execute() 22225 1726882758.41460: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882758.41465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.41474: variable 'omit' from source: magic vars 22225 1726882758.41753: variable 'ansible_distribution_major_version' from source: facts 22225 1726882758.41761: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882758.41885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882758.42092: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882758.42126: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882758.42153: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882758.42183: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882758.42250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882758.42268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882758.42288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882758.42312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882758.42376: variable '__network_is_ostree' from source: set_fact 22225 1726882758.42385: Evaluated conditional (not __network_is_ostree is defined): False 22225 1726882758.42388: when evaluation is False, skipping this task 22225 1726882758.42390: _execute() done 22225 1726882758.42393: dumping result to json 22225 1726882758.42395: done dumping result, returning 22225 1726882758.42404: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-ec05-55b7-0000000001ff] 22225 1726882758.42407: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001ff 22225 1726882758.42496: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000001ff 22225 1726882758.42499: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22225 1726882758.42569: no more pending results, returning what we have 22225 1726882758.42573: results queue empty 22225 1726882758.42573: checking for any_errors_fatal 22225 1726882758.42578: done checking for any_errors_fatal 22225 
1726882758.42579: checking for max_fail_percentage 22225 1726882758.42583: done checking for max_fail_percentage 22225 1726882758.42584: checking to see if all hosts have failed and the running result is not ok 22225 1726882758.42585: done checking to see if all hosts have failed 22225 1726882758.42585: getting the remaining hosts for this loop 22225 1726882758.42587: done getting the remaining hosts for this loop 22225 1726882758.42590: getting the next task for host managed_node1 22225 1726882758.42598: done getting next task for host managed_node1 22225 1726882758.42602: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22225 1726882758.42605: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882758.42619: getting variables 22225 1726882758.42620: in VariableManager get_vars() 22225 1726882758.42664: Calling all_inventory to load vars for managed_node1 22225 1726882758.42667: Calling groups_inventory to load vars for managed_node1 22225 1726882758.42669: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882758.42678: Calling all_plugins_play to load vars for managed_node1 22225 1726882758.42682: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882758.42685: Calling groups_plugins_play to load vars for managed_node1 22225 1726882758.42828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882758.42983: done with get_vars() 22225 1726882758.42991: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:39:18 -0400 (0:00:00.022) 0:00:13.823 ****** 22225 1726882758.43059: entering _queue_task() for managed_node1/service_facts 22225 1726882758.43060: Creating lock for service_facts 22225 1726882758.43275: worker is 1 (out of 1 available) 22225 1726882758.43293: exiting _queue_task() for managed_node1/service_facts 22225 1726882758.43304: done queuing things up, now waiting for results queue to drain 22225 1726882758.43306: waiting for pending results... 
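
The "Check which services are running" task uses the service_facts module (the log creates a lock for service_facts and, below, builds and transfers AnsiballZ_service_facts.py). A minimal sketch of the task as it might appear in set_facts.yml, with any no_log or check-mode settings omitted as assumptions:

- name: Check which services are running
  ansible.builtin.service_facts:

Once it runs, the gathered data is returned under ansible_facts.services, as the JSON at the end of this section shows.
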
22225 1726882758.43468: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 22225 1726882758.43562: in run() - task 0affc7ec-ae25-ec05-55b7-000000000201 22225 1726882758.43574: variable 'ansible_search_path' from source: unknown 22225 1726882758.43579: variable 'ansible_search_path' from source: unknown 22225 1726882758.43609: calling self._execute() 22225 1726882758.43678: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882758.43684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.43692: variable 'omit' from source: magic vars 22225 1726882758.44015: variable 'ansible_distribution_major_version' from source: facts 22225 1726882758.44026: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882758.44032: variable 'omit' from source: magic vars 22225 1726882758.44090: variable 'omit' from source: magic vars 22225 1726882758.44110: variable 'omit' from source: magic vars 22225 1726882758.44143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882758.44170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882758.44187: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882758.44204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882758.44213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882758.44239: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882758.44242: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882758.44245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.44315: Set connection var ansible_connection to ssh 22225 1726882758.44325: Set connection var ansible_pipelining to False 22225 1726882758.44333: Set connection var ansible_shell_executable to /bin/sh 22225 1726882758.44339: Set connection var ansible_timeout to 10 22225 1726882758.44341: Set connection var ansible_shell_type to sh 22225 1726882758.44347: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882758.44366: variable 'ansible_shell_executable' from source: unknown 22225 1726882758.44369: variable 'ansible_connection' from source: unknown 22225 1726882758.44372: variable 'ansible_module_compression' from source: unknown 22225 1726882758.44374: variable 'ansible_shell_type' from source: unknown 22225 1726882758.44376: variable 'ansible_shell_executable' from source: unknown 22225 1726882758.44379: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882758.44384: variable 'ansible_pipelining' from source: unknown 22225 1726882758.44386: variable 'ansible_timeout' from source: unknown 22225 1726882758.44390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882758.44539: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882758.44548: variable 'omit' from source: magic vars 22225 
1726882758.44553: starting attempt loop 22225 1726882758.44556: running the handler 22225 1726882758.44568: _low_level_execute_command(): starting 22225 1726882758.44574: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882758.45115: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882758.45118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882758.45123: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.45126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882758.45128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.45185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882758.45188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882758.45196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882758.45252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882758.46990: stdout chunk (state=3): >>>/root <<< 22225 1726882758.47097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882758.47154: stderr chunk (state=3): >>><<< 22225 1726882758.47158: stdout chunk (state=3): >>><<< 22225 1726882758.47176: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882758.47190: _low_level_execute_command(): starting 22225 1726882758.47196: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757 `" && echo ansible-tmp-1726882758.4717538-22759-196313957828757="` echo /root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757 `" ) && sleep 0' 22225 1726882758.47660: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882758.47665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882758.47668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.47677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882758.47682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.47730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882758.47735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882758.47737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882758.47786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882758.49759: stdout chunk (state=3): >>>ansible-tmp-1726882758.4717538-22759-196313957828757=/root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757 <<< 22225 1726882758.49876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882758.49931: stderr chunk (state=3): >>><<< 22225 1726882758.49935: stdout chunk (state=3): >>><<< 22225 1726882758.49948: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882758.4717538-22759-196313957828757=/root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882758.49991: variable 'ansible_module_compression' from source: unknown 22225 1726882758.50026: ANSIBALLZ: Using lock for service_facts 22225 1726882758.50030: ANSIBALLZ: Acquiring lock 22225 1726882758.50033: ANSIBALLZ: Lock acquired: 140272891387616 22225 1726882758.50038: ANSIBALLZ: Creating module 22225 1726882758.59180: ANSIBALLZ: Writing module into payload 22225 1726882758.59254: ANSIBALLZ: Writing module 22225 1726882758.59271: ANSIBALLZ: Renaming module 22225 1726882758.59277: ANSIBALLZ: Done creating module 22225 1726882758.59296: variable 'ansible_facts' from source: unknown 22225 1726882758.59347: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/AnsiballZ_service_facts.py 22225 1726882758.59448: Sending initial data 22225 1726882758.59452: Sent initial data (162 bytes) 22225 1726882758.59940: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882758.59944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882758.59946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.59949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882758.59951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.59998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882758.60004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882758.60025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882758.60074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882758.61770: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 22225 1726882758.61773: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882758.61815: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882758.61865: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp6lqt3d04 /root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/AnsiballZ_service_facts.py <<< 22225 1726882758.61874: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/AnsiballZ_service_facts.py" <<< 22225 1726882758.61918: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp6lqt3d04" to remote "/root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/AnsiballZ_service_facts.py" <<< 22225 1726882758.61921: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/AnsiballZ_service_facts.py" <<< 22225 1726882758.62519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882758.62588: stderr chunk (state=3): >>><<< 22225 1726882758.62592: stdout chunk (state=3): >>><<< 22225 1726882758.62613: done transferring module to remote 22225 1726882758.62625: _low_level_execute_command(): starting 22225 1726882758.62630: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/ /root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/AnsiballZ_service_facts.py && sleep 0' 22225 1726882758.63108: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882758.63112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882758.63118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.63120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882758.63125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882758.63127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.63172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882758.63179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882758.63229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882758.65026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882758.65073: stderr chunk (state=3): >>><<< 22225 1726882758.65076: stdout chunk (state=3): >>><<< 22225 1726882758.65092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882758.65095: _low_level_execute_command(): starting 22225 1726882758.65100: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/AnsiballZ_service_facts.py && sleep 0' 22225 1726882758.65556: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882758.65560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882758.65562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882758.65565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882758.65567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882758.65620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882758.65629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882758.65685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882760.83176: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.<<< 22225 1726882760.83191: stdout chunk (state=3): >>>service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, 
"grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22225 1726882760.84852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882760.84928: stderr chunk (state=3): >>><<< 22225 1726882760.84939: stdout chunk (state=3): >>><<< 22225 1726882760.85329: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", 
"state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": 
"systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": 
"udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
22225 1726882760.86862: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882760.86866: _low_level_execute_command(): starting 22225 1726882760.86874: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882758.4717538-22759-196313957828757/ > /dev/null 2>&1 && sleep 0' 22225 1726882760.88078: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882760.88082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882760.88085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882760.88087: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882760.88089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882760.88252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882760.88439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882760.88467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882760.90376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882760.90500: stderr chunk (state=3): >>><<< 22225 1726882760.90513: stdout chunk (state=3): >>><<< 22225 1726882760.90828: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882760.90832: handler run complete 22225 1726882760.91086: variable 'ansible_facts' from source: unknown 22225 1726882760.95129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882760.96057: variable 'ansible_facts' from source: unknown 22225 1726882760.96380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882760.96994: attempt loop complete, returning result 22225 1726882760.97036: _execute() done 22225 1726882760.97044: dumping result to json 22225 1726882760.97159: done dumping result, returning 22225 1726882760.97176: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-ec05-55b7-000000000201] 22225 1726882760.97197: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000201 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882760.98878: no more pending results, returning what we have 22225 1726882760.98880: results queue empty 22225 1726882760.98881: checking for any_errors_fatal 22225 1726882760.98884: done checking for any_errors_fatal 22225 1726882760.98885: checking for max_fail_percentage 22225 1726882760.98887: done checking for max_fail_percentage 22225 1726882760.98888: checking to see if all hosts have failed and the running result is not ok 22225 1726882760.98888: done checking to see if all hosts have failed 22225 1726882760.98889: getting the remaining hosts for this loop 22225 1726882760.98890: done getting the remaining hosts for this loop 22225 1726882760.98894: getting the next task for host managed_node1 22225 1726882760.98899: done getting next task for host managed_node1 22225 1726882760.98902: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 22225 1726882760.98906: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882760.98915: getting variables 22225 1726882760.98916: in VariableManager get_vars() 22225 1726882760.98952: Calling all_inventory to load vars for managed_node1 22225 1726882760.98955: Calling groups_inventory to load vars for managed_node1 22225 1726882760.98957: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882760.98969: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000201 22225 1726882760.98972: WORKER PROCESS EXITING 22225 1726882760.99005: Calling all_plugins_play to load vars for managed_node1 22225 1726882760.99008: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882760.99012: Calling groups_plugins_play to load vars for managed_node1 22225 1726882760.99863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882761.01085: done with get_vars() 22225 1726882761.01099: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:21 -0400 (0:00:02.581) 0:00:16.404 ****** 22225 1726882761.01214: entering _queue_task() for managed_node1/package_facts 22225 1726882761.01216: Creating lock for package_facts 22225 1726882761.01507: worker is 1 (out of 1 available) 22225 1726882761.01520: exiting _queue_task() for managed_node1/package_facts 22225 1726882761.01732: done queuing things up, now waiting for results queue to drain 22225 1726882761.01734: waiting for pending results... 22225 1726882761.01809: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 22225 1726882761.02067: in run() - task 0affc7ec-ae25-ec05-55b7-000000000202 22225 1726882761.02072: variable 'ansible_search_path' from source: unknown 22225 1726882761.02075: variable 'ansible_search_path' from source: unknown 22225 1726882761.02078: calling self._execute() 22225 1726882761.02129: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882761.02142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882761.02157: variable 'omit' from source: magic vars 22225 1726882761.02535: variable 'ansible_distribution_major_version' from source: facts 22225 1726882761.02552: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882761.02562: variable 'omit' from source: magic vars 22225 1726882761.02644: variable 'omit' from source: magic vars 22225 1726882761.02687: variable 'omit' from source: magic vars 22225 1726882761.02742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882761.02787: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882761.02814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882761.02913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882761.02916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882761.02919: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882761.02921: variable 'ansible_host' from source: host vars for 
'managed_node1' 22225 1726882761.02926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882761.03328: Set connection var ansible_connection to ssh 22225 1726882761.03332: Set connection var ansible_pipelining to False 22225 1726882761.03335: Set connection var ansible_shell_executable to /bin/sh 22225 1726882761.03348: Set connection var ansible_timeout to 10 22225 1726882761.03356: Set connection var ansible_shell_type to sh 22225 1726882761.03369: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882761.03438: variable 'ansible_shell_executable' from source: unknown 22225 1726882761.03528: variable 'ansible_connection' from source: unknown 22225 1726882761.03532: variable 'ansible_module_compression' from source: unknown 22225 1726882761.03535: variable 'ansible_shell_type' from source: unknown 22225 1726882761.03537: variable 'ansible_shell_executable' from source: unknown 22225 1726882761.03540: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882761.03542: variable 'ansible_pipelining' from source: unknown 22225 1726882761.03544: variable 'ansible_timeout' from source: unknown 22225 1726882761.03546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882761.04061: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882761.04065: variable 'omit' from source: magic vars 22225 1726882761.04068: starting attempt loop 22225 1726882761.04070: running the handler 22225 1726882761.04073: _low_level_execute_command(): starting 22225 1726882761.04075: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882761.05303: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882761.05405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882761.05541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882761.05596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882761.07337: stdout chunk (state=3): >>>/root <<< 22225 1726882761.07564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882761.07567: stdout chunk (state=3): >>><<< 22225 1726882761.07570: stderr chunk (state=3): >>><<< 22225 1726882761.07602: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882761.07653: _low_level_execute_command(): starting 22225 1726882761.07665: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320 `" && echo ansible-tmp-1726882761.0763898-22808-77582411621320="` echo /root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320 `" ) && sleep 0' 22225 1726882761.08881: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882761.08898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882761.08918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882761.09089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882761.09131: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882761.09224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882761.11193: stdout chunk (state=3): >>>ansible-tmp-1726882761.0763898-22808-77582411621320=/root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320 <<< 22225 1726882761.11366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882761.11395: stdout chunk (state=3): >>><<< 22225 1726882761.11399: stderr chunk (state=3): >>><<< 22225 1726882761.11416: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882761.0763898-22808-77582411621320=/root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882761.11627: variable 'ansible_module_compression' from source: unknown 22225 1726882761.11631: ANSIBALLZ: Using lock for package_facts 22225 1726882761.11634: ANSIBALLZ: Acquiring lock 22225 1726882761.11636: ANSIBALLZ: Lock acquired: 140272926788064 22225 1726882761.11639: ANSIBALLZ: Creating module 22225 1726882761.49168: ANSIBALLZ: Writing module into payload 22225 1726882761.49496: ANSIBALLZ: Writing module 22225 1726882761.49500: ANSIBALLZ: Renaming module 22225 1726882761.49503: ANSIBALLZ: Done creating module 22225 1726882761.49645: variable 'ansible_facts' from source: unknown 22225 1726882761.50127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/AnsiballZ_package_facts.py 22225 1726882761.50392: Sending initial data 22225 1726882761.50395: Sent initial data (161 bytes) 22225 1726882761.51834: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882761.51859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882761.51874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882761.51963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882761.53851: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882761.53900: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882761.53961: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpc8tx0i7i /root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/AnsiballZ_package_facts.py <<< 22225 1726882761.53965: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/AnsiballZ_package_facts.py" <<< 22225 1726882761.54010: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpc8tx0i7i" to remote "/root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/AnsiballZ_package_facts.py" <<< 22225 1726882761.56874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882761.56980: stderr chunk (state=3): >>><<< 22225 1726882761.56986: stdout chunk (state=3): >>><<< 22225 1726882761.57014: done transferring module to remote 22225 1726882761.57229: _low_level_execute_command(): starting 22225 1726882761.57237: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/ /root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/AnsiballZ_package_facts.py && sleep 0' 22225 1726882761.58460: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882761.58469: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882761.58480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882761.58498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882761.58513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882761.58516: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882761.58528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882761.58542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882761.58550: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882761.58557: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882761.58565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882761.58652: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882761.58840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882761.58952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882761.60800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882761.61003: stderr chunk (state=3): >>><<< 22225 1726882761.61007: stdout chunk (state=3): >>><<< 22225 1726882761.61094: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882761.61098: _low_level_execute_command(): starting 22225 1726882761.61102: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/AnsiballZ_package_facts.py && sleep 0' 22225 1726882761.62808: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882761.62812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882761.62994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882761.62999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882761.63002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882761.63004: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882761.63007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882761.63168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882761.63219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882762.26437: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 22225 1726882762.26458: stdout chunk (state=3): >>>me": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": 
"libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 22225 1726882762.26658: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": 
"4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": 
"2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": 
[{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": 
"2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", 
"release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": 
"1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], <<< 22225 1726882762.26704: stdout chunk (state=3): >>>"perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": 
"9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "sour<<< 22225 1726882762.26711: stdout chunk (state=3): >>>ce": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22225 1726882762.28532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 
closed. <<< 22225 1726882762.28535: stdout chunk (state=3): >>><<< 22225 1726882762.28538: stderr chunk (state=3): >>><<< 22225 1726882762.28566: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": 
[{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": 
"0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": 
"123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", 
"version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", 
"release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": 
"libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": 
[{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": 
"zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", 
"version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", 
"release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": 
"1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882762.30648: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882762.30668: _low_level_execute_command(): starting 22225 1726882762.30677: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882761.0763898-22808-77582411621320/ > /dev/null 2>&1 && sleep 0' 22225 1726882762.31355: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882762.31358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882762.31361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882762.31364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882762.31366: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882762.31368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882762.31370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882762.31425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882762.31432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882762.31436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882762.31489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882762.33485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882762.33502: stderr chunk (state=3): >>><<< 22225 1726882762.33519: stdout chunk (state=3): >>><<< 
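The module run above is ansible.builtin.package_facts, invoked with manager: ["auto"] and strategy: first; its output places one entry per installed package under ansible_facts.packages, each a list of {name, version, release, epoch, arch, source} dicts. A minimal sketch of gathering and consuming that fact is below; the role runs the gathering task with no_log enabled (hence the censored result that follows), and the debug consumer plus the choice of openssl as the example key are purely illustrative.

- name: Check which packages are installed   # task name as reported in the trace below
  ansible.builtin.package_facts:
    manager: auto
    strategy: first

- name: Show the installed openssl version   # illustrative consumer, not part of the role
  ansible.builtin.debug:
    msg: "openssl {{ ansible_facts.packages['openssl'][0].version }} is installed"
  when: "'openssl' in ansible_facts.packages"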
22225 1726882762.33634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882762.33638: handler run complete 22225 1726882762.34762: variable 'ansible_facts' from source: unknown 22225 1726882762.35145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882762.38421: variable 'ansible_facts' from source: unknown 22225 1726882762.44809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882762.45839: attempt loop complete, returning result 22225 1726882762.45866: _execute() done 22225 1726882762.45873: dumping result to json 22225 1726882762.46161: done dumping result, returning 22225 1726882762.46327: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-ec05-55b7-000000000202] 22225 1726882762.46330: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000202 22225 1726882762.49209: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000202 22225 1726882762.49212: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882762.49308: no more pending results, returning what we have 22225 1726882762.49310: results queue empty 22225 1726882762.49311: checking for any_errors_fatal 22225 1726882762.49316: done checking for any_errors_fatal 22225 1726882762.49316: checking for max_fail_percentage 22225 1726882762.49318: done checking for max_fail_percentage 22225 1726882762.49319: checking to see if all hosts have failed and the running result is not ok 22225 1726882762.49320: done checking to see if all hosts have failed 22225 1726882762.49321: getting the remaining hosts for this loop 22225 1726882762.49324: done getting the remaining hosts for this loop 22225 1726882762.49328: getting the next task for host managed_node1 22225 1726882762.49335: done getting next task for host managed_node1 22225 1726882762.49338: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22225 1726882762.49341: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882762.49353: getting variables 22225 1726882762.49354: in VariableManager get_vars() 22225 1726882762.49388: Calling all_inventory to load vars for managed_node1 22225 1726882762.49391: Calling groups_inventory to load vars for managed_node1 22225 1726882762.49393: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882762.49403: Calling all_plugins_play to load vars for managed_node1 22225 1726882762.49406: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882762.49409: Calling groups_plugins_play to load vars for managed_node1 22225 1726882762.51238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882762.53530: done with get_vars() 22225 1726882762.53557: done getting variables 22225 1726882762.53630: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:22 -0400 (0:00:01.524) 0:00:17.929 ****** 22225 1726882762.53671: entering _queue_task() for managed_node1/debug 22225 1726882762.54018: worker is 1 (out of 1 available) 22225 1726882762.54036: exiting _queue_task() for managed_node1/debug 22225 1726882762.54049: done queuing things up, now waiting for results queue to drain 22225 1726882762.54050: waiting for pending results... 
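The task being queued here, Print network provider (roles/network/tasks/main.yml:7), is a plain debug action; the trace that follows evaluates the role's distribution guard and prints the network_provider fact set earlier in the run. A minimal sketch of such a task, assuming the guard sits on the task itself (in the role it may equally live on an enclosing block):

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
  when: ansible_distribution_major_version != '6'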
22225 1726882762.54445: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 22225 1726882762.54488: in run() - task 0affc7ec-ae25-ec05-55b7-000000000018 22225 1726882762.54512: variable 'ansible_search_path' from source: unknown 22225 1726882762.54520: variable 'ansible_search_path' from source: unknown 22225 1726882762.54575: calling self._execute() 22225 1726882762.54695: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882762.54708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882762.54725: variable 'omit' from source: magic vars 22225 1726882762.55163: variable 'ansible_distribution_major_version' from source: facts 22225 1726882762.55186: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882762.55203: variable 'omit' from source: magic vars 22225 1726882762.55264: variable 'omit' from source: magic vars 22225 1726882762.55384: variable 'network_provider' from source: set_fact 22225 1726882762.55414: variable 'omit' from source: magic vars 22225 1726882762.55465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882762.55514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882762.55547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882762.55572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882762.55593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882762.55636: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882762.55646: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882762.55740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882762.55775: Set connection var ansible_connection to ssh 22225 1726882762.55795: Set connection var ansible_pipelining to False 22225 1726882762.55809: Set connection var ansible_shell_executable to /bin/sh 22225 1726882762.55820: Set connection var ansible_timeout to 10 22225 1726882762.55830: Set connection var ansible_shell_type to sh 22225 1726882762.55841: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882762.55878: variable 'ansible_shell_executable' from source: unknown 22225 1726882762.55891: variable 'ansible_connection' from source: unknown 22225 1726882762.55901: variable 'ansible_module_compression' from source: unknown 22225 1726882762.55908: variable 'ansible_shell_type' from source: unknown 22225 1726882762.55915: variable 'ansible_shell_executable' from source: unknown 22225 1726882762.55924: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882762.55932: variable 'ansible_pipelining' from source: unknown 22225 1726882762.55939: variable 'ansible_timeout' from source: unknown 22225 1726882762.55947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882762.56121: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 22225 1726882762.56177: variable 'omit' from source: magic vars 22225 1726882762.56183: starting attempt loop 22225 1726882762.56187: running the handler 22225 1726882762.56215: handler run complete 22225 1726882762.56238: attempt loop complete, returning result 22225 1726882762.56246: _execute() done 22225 1726882762.56288: dumping result to json 22225 1726882762.56291: done dumping result, returning 22225 1726882762.56294: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-ec05-55b7-000000000018] 22225 1726882762.56296: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000018 ok: [managed_node1] => {} MSG: Using network provider: nm 22225 1726882762.56465: no more pending results, returning what we have 22225 1726882762.56469: results queue empty 22225 1726882762.56470: checking for any_errors_fatal 22225 1726882762.56485: done checking for any_errors_fatal 22225 1726882762.56486: checking for max_fail_percentage 22225 1726882762.56488: done checking for max_fail_percentage 22225 1726882762.56489: checking to see if all hosts have failed and the running result is not ok 22225 1726882762.56490: done checking to see if all hosts have failed 22225 1726882762.56491: getting the remaining hosts for this loop 22225 1726882762.56493: done getting the remaining hosts for this loop 22225 1726882762.56498: getting the next task for host managed_node1 22225 1726882762.56505: done getting next task for host managed_node1 22225 1726882762.56510: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22225 1726882762.56514: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882762.56527: getting variables 22225 1726882762.56529: in VariableManager get_vars() 22225 1726882762.56576: Calling all_inventory to load vars for managed_node1 22225 1726882762.56582: Calling groups_inventory to load vars for managed_node1 22225 1726882762.56585: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882762.56598: Calling all_plugins_play to load vars for managed_node1 22225 1726882762.56601: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882762.56605: Calling groups_plugins_play to load vars for managed_node1 22225 1726882762.57439: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000018 22225 1726882762.57442: WORKER PROCESS EXITING 22225 1726882762.58614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882762.60656: done with get_vars() 22225 1726882762.60694: done getting variables 22225 1726882762.60758: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:22 -0400 (0:00:00.071) 0:00:18.000 ****** 22225 1726882762.60796: entering _queue_task() for managed_node1/fail 22225 1726882762.61354: worker is 1 (out of 1 available) 22225 1726882762.61365: exiting _queue_task() for managed_node1/fail 22225 1726882762.61375: done queuing things up, now waiting for results queue to drain 22225 1726882762.61377: waiting for pending results... 
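The fail task queued here (main.yml:11) aborts when network_state is used together with the initscripts provider; the trace below reports network_state != {} as the false condition, so nothing past it is evaluated. A rough sketch of the structure, where the message text and the provider comparison are assumptions taken from the task name rather than the trace:

- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider  # wording assumed
  when:
    - network_state != {}
    - network_provider == "initscripts"   # assumed from the task name; never reached in this run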
22225 1726882762.61463: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22225 1726882762.61629: in run() - task 0affc7ec-ae25-ec05-55b7-000000000019 22225 1726882762.61649: variable 'ansible_search_path' from source: unknown 22225 1726882762.61658: variable 'ansible_search_path' from source: unknown 22225 1726882762.61703: calling self._execute() 22225 1726882762.61802: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882762.61814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882762.61834: variable 'omit' from source: magic vars 22225 1726882762.62235: variable 'ansible_distribution_major_version' from source: facts 22225 1726882762.62256: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882762.62394: variable 'network_state' from source: role '' defaults 22225 1726882762.62409: Evaluated conditional (network_state != {}): False 22225 1726882762.62473: when evaluation is False, skipping this task 22225 1726882762.62476: _execute() done 22225 1726882762.62478: dumping result to json 22225 1726882762.62483: done dumping result, returning 22225 1726882762.62486: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-ec05-55b7-000000000019] 22225 1726882762.62534: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000019 22225 1726882762.62767: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000019 22225 1726882762.62771: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882762.62823: no more pending results, returning what we have 22225 1726882762.62827: results queue empty 22225 1726882762.62829: checking for any_errors_fatal 22225 1726882762.62836: done checking for any_errors_fatal 22225 1726882762.62836: checking for max_fail_percentage 22225 1726882762.62838: done checking for max_fail_percentage 22225 1726882762.62839: checking to see if all hosts have failed and the running result is not ok 22225 1726882762.62840: done checking to see if all hosts have failed 22225 1726882762.62841: getting the remaining hosts for this loop 22225 1726882762.62843: done getting the remaining hosts for this loop 22225 1726882762.62847: getting the next task for host managed_node1 22225 1726882762.62852: done getting next task for host managed_node1 22225 1726882762.62856: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22225 1726882762.62859: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882762.62876: getting variables 22225 1726882762.62877: in VariableManager get_vars() 22225 1726882762.62926: Calling all_inventory to load vars for managed_node1 22225 1726882762.62929: Calling groups_inventory to load vars for managed_node1 22225 1726882762.62931: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882762.62944: Calling all_plugins_play to load vars for managed_node1 22225 1726882762.62947: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882762.62950: Calling groups_plugins_play to load vars for managed_node1 22225 1726882762.65232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882762.68054: done with get_vars() 22225 1726882762.68086: done getting variables 22225 1726882762.68157: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:22 -0400 (0:00:00.073) 0:00:18.074 ****** 22225 1726882762.68196: entering _queue_task() for managed_node1/fail 22225 1726882762.68946: worker is 1 (out of 1 available) 22225 1726882762.68960: exiting _queue_task() for managed_node1/fail 22225 1726882762.68969: done queuing things up, now waiting for results queue to drain 22225 1726882762.68971: waiting for pending results... 
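The version-guard variant at main.yml:18 reports the same false condition, which is how a when list behaves: entries are evaluated in order and the task is skipped at the first false one, so the version comparison implied by the task name is never reached on this host. A sketch of that structure; only the network_state condition is confirmed by the trace, the rest is inferred from the task name:

- name: >-
    Abort applying the network state configuration if the system version
    of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires version 8 or later  # wording assumed
  when:
    - network_state != {}                              # false here, so evaluation stops
    - ansible_distribution_major_version | int < 8     # assumed from the task name; not evaluated in this run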
22225 1726882762.69340: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22225 1726882762.69345: in run() - task 0affc7ec-ae25-ec05-55b7-00000000001a 22225 1726882762.69348: variable 'ansible_search_path' from source: unknown 22225 1726882762.69351: variable 'ansible_search_path' from source: unknown 22225 1726882762.69366: calling self._execute() 22225 1726882762.69468: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882762.69482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882762.69505: variable 'omit' from source: magic vars 22225 1726882762.69916: variable 'ansible_distribution_major_version' from source: facts 22225 1726882762.69936: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882762.70071: variable 'network_state' from source: role '' defaults 22225 1726882762.70096: Evaluated conditional (network_state != {}): False 22225 1726882762.70104: when evaluation is False, skipping this task 22225 1726882762.70111: _execute() done 22225 1726882762.70118: dumping result to json 22225 1726882762.70128: done dumping result, returning 22225 1726882762.70140: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-ec05-55b7-00000000001a] 22225 1726882762.70151: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001a skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882762.70307: no more pending results, returning what we have 22225 1726882762.70312: results queue empty 22225 1726882762.70313: checking for any_errors_fatal 22225 1726882762.70326: done checking for any_errors_fatal 22225 1726882762.70327: checking for max_fail_percentage 22225 1726882762.70329: done checking for max_fail_percentage 22225 1726882762.70330: checking to see if all hosts have failed and the running result is not ok 22225 1726882762.70331: done checking to see if all hosts have failed 22225 1726882762.70332: getting the remaining hosts for this loop 22225 1726882762.70334: done getting the remaining hosts for this loop 22225 1726882762.70339: getting the next task for host managed_node1 22225 1726882762.70346: done getting next task for host managed_node1 22225 1726882762.70351: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22225 1726882762.70355: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882762.70372: getting variables 22225 1726882762.70374: in VariableManager get_vars() 22225 1726882762.70525: Calling all_inventory to load vars for managed_node1 22225 1726882762.70529: Calling groups_inventory to load vars for managed_node1 22225 1726882762.70532: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882762.70546: Calling all_plugins_play to load vars for managed_node1 22225 1726882762.70548: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882762.70551: Calling groups_plugins_play to load vars for managed_node1 22225 1726882762.71239: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001a 22225 1726882762.71243: WORKER PROCESS EXITING 22225 1726882762.72367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882762.74607: done with get_vars() 22225 1726882762.74633: done getting variables 22225 1726882762.74696: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:22 -0400 (0:00:00.065) 0:00:18.139 ****** 22225 1726882762.74733: entering _queue_task() for managed_node1/fail 22225 1726882762.75031: worker is 1 (out of 1 available) 22225 1726882762.75045: exiting _queue_task() for managed_node1/fail 22225 1726882762.75057: done queuing things up, now waiting for results queue to drain 22225 1726882762.75059: waiting for pending results... 
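For the teaming abort at main.yml:25 the trace below shows both guards: ansible_distribution_major_version | int > 9 is True, then ansible_distribution in __network_rh_distros is False (Fedora is not in the role's RHEL-family list for this check), so the task is skipped. A sketch assembled from those evaluated conditionals; the message wording, and any additional check that team interfaces are actually defined, are assumptions:

- name: >-
    Abort applying teaming configuration if the system version of the
    managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later  # wording assumed
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros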
22225 1726882762.75345: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22225 1726882762.75490: in run() - task 0affc7ec-ae25-ec05-55b7-00000000001b 22225 1726882762.75511: variable 'ansible_search_path' from source: unknown 22225 1726882762.75519: variable 'ansible_search_path' from source: unknown 22225 1726882762.75562: calling self._execute() 22225 1726882762.75664: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882762.75681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882762.75697: variable 'omit' from source: magic vars 22225 1726882762.76095: variable 'ansible_distribution_major_version' from source: facts 22225 1726882762.76116: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882762.76315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882762.79428: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882762.79563: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882762.79664: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882762.79762: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882762.79792: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882762.80058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882762.80100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882762.80136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882762.80430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882762.80434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882762.80489: variable 'ansible_distribution_major_version' from source: facts 22225 1726882762.80565: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22225 1726882762.80816: variable 'ansible_distribution' from source: facts 22225 1726882762.80882: variable '__network_rh_distros' from source: role '' defaults 22225 1726882762.80898: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22225 1726882762.80932: when evaluation is False, skipping this task 22225 1726882762.80940: _execute() done 22225 1726882762.80948: dumping result to json 22225 1726882762.80956: done dumping result, returning 22225 1726882762.80991: done running TaskExecutor() 
for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-ec05-55b7-00000000001b] 22225 1726882762.81036: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001b 22225 1726882762.81378: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001b 22225 1726882762.81385: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22225 1726882762.81543: no more pending results, returning what we have 22225 1726882762.81548: results queue empty 22225 1726882762.81549: checking for any_errors_fatal 22225 1726882762.81556: done checking for any_errors_fatal 22225 1726882762.81557: checking for max_fail_percentage 22225 1726882762.81559: done checking for max_fail_percentage 22225 1726882762.81560: checking to see if all hosts have failed and the running result is not ok 22225 1726882762.81561: done checking to see if all hosts have failed 22225 1726882762.81561: getting the remaining hosts for this loop 22225 1726882762.81563: done getting the remaining hosts for this loop 22225 1726882762.81568: getting the next task for host managed_node1 22225 1726882762.81577: done getting next task for host managed_node1 22225 1726882762.81584: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22225 1726882762.81588: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882762.81604: getting variables 22225 1726882762.81607: in VariableManager get_vars() 22225 1726882762.81657: Calling all_inventory to load vars for managed_node1 22225 1726882762.81660: Calling groups_inventory to load vars for managed_node1 22225 1726882762.81663: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882762.81674: Calling all_plugins_play to load vars for managed_node1 22225 1726882762.81677: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882762.81683: Calling groups_plugins_play to load vars for managed_node1 22225 1726882762.83860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882762.86048: done with get_vars() 22225 1726882762.86074: done getting variables 22225 1726882762.86178: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:22 -0400 (0:00:00.114) 0:00:18.254 ****** 22225 1726882762.86213: entering _queue_task() for managed_node1/dnf 22225 1726882762.86519: worker is 1 (out of 1 available) 22225 1726882762.86637: exiting _queue_task() for managed_node1/dnf 22225 1726882762.86648: done queuing things up, now waiting for results queue to drain 22225 1726882762.86650: waiting for pending results... 
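The dnf task queued here (main.yml:36) only needs to run when the requested connections define wireless or team interfaces; the trace below evaluates the Fedora/EL8+ guard as True and __network_wireless_connections_defined or __network_team_connections_defined as False, so it is skipped. A sketch of one way such an update check could be written; the conditions come from the trace, but the package variable and the check_mode/state pattern are assumptions, not the role's actual implementation:

- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # hypothetical variable; the real package list is not visible in this log
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined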
22225 1726882762.86846: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22225 1726882762.87129: in run() - task 0affc7ec-ae25-ec05-55b7-00000000001c 22225 1726882762.87133: variable 'ansible_search_path' from source: unknown 22225 1726882762.87135: variable 'ansible_search_path' from source: unknown 22225 1726882762.87139: calling self._execute() 22225 1726882762.87142: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882762.87155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882762.87170: variable 'omit' from source: magic vars 22225 1726882762.87583: variable 'ansible_distribution_major_version' from source: facts 22225 1726882762.87601: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882762.87828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882762.90318: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882762.90766: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882762.90814: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882762.90861: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882762.90896: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882762.90993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882762.91031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882762.91068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882762.91120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882762.91144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882762.91262: variable 'ansible_distribution' from source: facts 22225 1726882762.91276: variable 'ansible_distribution_major_version' from source: facts 22225 1726882762.91292: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22225 1726882762.91411: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882762.91561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882762.91827: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882762.91831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882762.91838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882762.91861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882762.91914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882762.91983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882762.92229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882762.92233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882762.92235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882762.92293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882762.92366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882762.92400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882762.92494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882762.92573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882762.92973: variable 'network_connections' from source: task vars 22225 1726882762.93042: variable 'interface' from source: play vars 22225 1726882762.93428: variable 'interface' from source: play vars 22225 1726882762.93445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882762.93766: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882762.93812: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882762.93899: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882762.94007: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882762.94061: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882762.94176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882762.94224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882762.94352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882762.94520: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882762.95207: variable 'network_connections' from source: task vars 22225 1726882762.95218: variable 'interface' from source: play vars 22225 1726882762.95296: variable 'interface' from source: play vars 22225 1726882762.95337: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22225 1726882762.95345: when evaluation is False, skipping this task 22225 1726882762.95352: _execute() done 22225 1726882762.95359: dumping result to json 22225 1726882762.95367: done dumping result, returning 22225 1726882762.95378: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-ec05-55b7-00000000001c] 22225 1726882762.95395: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001c 22225 1726882762.95761: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001c 22225 1726882762.95764: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22225 1726882762.95811: no more pending results, returning what we have 22225 1726882762.95815: results queue empty 22225 1726882762.95816: checking for any_errors_fatal 22225 1726882762.95821: done checking for any_errors_fatal 22225 1726882762.95823: checking for max_fail_percentage 22225 1726882762.95825: done checking for max_fail_percentage 22225 1726882762.95826: checking to see if all hosts have failed and the running result is not ok 22225 1726882762.95827: done checking to see if all hosts have failed 22225 1726882762.95827: getting the remaining hosts for this loop 22225 1726882762.95829: done getting the remaining hosts for this loop 22225 1726882762.95833: getting the next task for host managed_node1 22225 1726882762.95839: done getting next task for host managed_node1 22225 1726882762.95843: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 22225 1726882762.95846: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882762.95860: getting variables 22225 1726882762.95862: in VariableManager get_vars() 22225 1726882762.95905: Calling all_inventory to load vars for managed_node1 22225 1726882762.95908: Calling groups_inventory to load vars for managed_node1 22225 1726882762.95911: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882762.95921: Calling all_plugins_play to load vars for managed_node1 22225 1726882762.95925: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882762.95929: Calling groups_plugins_play to load vars for managed_node1 22225 1726882762.97885: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882763.01365: done with get_vars() 22225 1726882763.01408: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22225 1726882763.01499: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:23 -0400 (0:00:00.155) 0:00:18.410 ****** 22225 1726882763.01740: entering _queue_task() for managed_node1/yum 22225 1726882763.01742: Creating lock for yum 22225 1726882763.02204: worker is 1 (out of 1 available) 22225 1726882763.02219: exiting _queue_task() for managed_node1/yum 22225 1726882763.02776: done queuing things up, now waiting for results queue to drain 22225 1726882763.02778: waiting for pending results... 
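The yum counterpart at main.yml:48 targets hosts older than EL8; the trace redirects ansible.builtin.yum to the dnf action plugin and then skips because ansible_distribution_major_version | int < 8 is False on this Fedora 40 host. A sketch mirroring the dnf example above; only the version comparison is confirmed by the trace, the second condition and the module body are assumed by symmetry:

- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: "{{ network_packages }}"   # hypothetical variable, as in the dnf sketch above
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined   # assumed by symmetry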
22225 1726882763.03019: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22225 1726882763.03207: in run() - task 0affc7ec-ae25-ec05-55b7-00000000001d 22225 1726882763.03261: variable 'ansible_search_path' from source: unknown 22225 1726882763.03429: variable 'ansible_search_path' from source: unknown 22225 1726882763.03432: calling self._execute() 22225 1726882763.03590: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882763.03729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882763.03733: variable 'omit' from source: magic vars 22225 1726882763.04537: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.04556: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882763.04928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882763.10130: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882763.10255: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882763.10381: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882763.10529: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882763.10533: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882763.10743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.10840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.10945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.11001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.11242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.11369: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.11394: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22225 1726882763.11401: when evaluation is False, skipping this task 22225 1726882763.11468: _execute() done 22225 1726882763.11527: dumping result to json 22225 1726882763.11531: done dumping result, returning 22225 1726882763.11534: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-ec05-55b7-00000000001d] 22225 
1726882763.11537: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001d skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22225 1726882763.11919: no more pending results, returning what we have 22225 1726882763.11926: results queue empty 22225 1726882763.11927: checking for any_errors_fatal 22225 1726882763.11936: done checking for any_errors_fatal 22225 1726882763.11937: checking for max_fail_percentage 22225 1726882763.11939: done checking for max_fail_percentage 22225 1726882763.11939: checking to see if all hosts have failed and the running result is not ok 22225 1726882763.11941: done checking to see if all hosts have failed 22225 1726882763.11941: getting the remaining hosts for this loop 22225 1726882763.11943: done getting the remaining hosts for this loop 22225 1726882763.11947: getting the next task for host managed_node1 22225 1726882763.11955: done getting next task for host managed_node1 22225 1726882763.11960: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22225 1726882763.11963: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882763.11982: getting variables 22225 1726882763.11984: in VariableManager get_vars() 22225 1726882763.12258: Calling all_inventory to load vars for managed_node1 22225 1726882763.12262: Calling groups_inventory to load vars for managed_node1 22225 1726882763.12264: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882763.12276: Calling all_plugins_play to load vars for managed_node1 22225 1726882763.12281: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882763.12285: Calling groups_plugins_play to load vars for managed_node1 22225 1726882763.12868: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001d 22225 1726882763.12872: WORKER PROCESS EXITING 22225 1726882763.15235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882763.23561: done with get_vars() 22225 1726882763.23591: done getting variables 22225 1726882763.23647: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:23 -0400 (0:00:00.219) 0:00:18.629 ****** 22225 1726882763.23678: entering _queue_task() for managed_node1/fail 22225 1726882763.24018: worker is 1 (out of 1 available) 22225 1726882763.24034: exiting _queue_task() for managed_node1/fail 22225 1726882763.24048: done queuing things up, now waiting for results queue to drain 22225 1726882763.24050: waiting for pending results... 
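
The consent task queued above (tasks/main.yml:60) loads the fail action and, in the trace below, is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true. A minimal sketch of such a guard with an assumed message; the real role task presumably also honours a user-consent variable so it does not fail unconditionally:

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    # assumed wording; the actual message in the role may differ
    msg: NetworkManager needs to be restarted to apply wireless or team connections, which requires user consent.
  when: __network_wireless_connections_defined or __network_team_connections_defined
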
22225 1726882763.24337: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22225 1726882763.24502: in run() - task 0affc7ec-ae25-ec05-55b7-00000000001e 22225 1726882763.24527: variable 'ansible_search_path' from source: unknown 22225 1726882763.24564: variable 'ansible_search_path' from source: unknown 22225 1726882763.24595: calling self._execute() 22225 1726882763.24714: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882763.24786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882763.24790: variable 'omit' from source: magic vars 22225 1726882763.25789: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.25793: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882763.26129: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882763.26728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882763.29584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882763.29696: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882763.29748: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882763.29795: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882763.29835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882763.29938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.30030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.30035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.30063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.30087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.30149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.30178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.30214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.30267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.30287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.30355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.30374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.30461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.30465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.30475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.30685: variable 'network_connections' from source: task vars 22225 1726882763.30709: variable 'interface' from source: play vars 22225 1726882763.30804: variable 'interface' from source: play vars 22225 1726882763.30912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882763.31103: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882763.31150: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882763.31240: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882763.31243: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882763.31277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882763.31308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882763.31345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.31393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882763.31486: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882763.31798: variable 'network_connections' from 
source: task vars 22225 1726882763.31810: variable 'interface' from source: play vars 22225 1726882763.31886: variable 'interface' from source: play vars 22225 1726882763.31940: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22225 1726882763.31944: when evaluation is False, skipping this task 22225 1726882763.32127: _execute() done 22225 1726882763.32130: dumping result to json 22225 1726882763.32132: done dumping result, returning 22225 1726882763.32135: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-ec05-55b7-00000000001e] 22225 1726882763.32137: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001e 22225 1726882763.32224: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001e 22225 1726882763.32228: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22225 1726882763.32283: no more pending results, returning what we have 22225 1726882763.32288: results queue empty 22225 1726882763.32289: checking for any_errors_fatal 22225 1726882763.32297: done checking for any_errors_fatal 22225 1726882763.32298: checking for max_fail_percentage 22225 1726882763.32300: done checking for max_fail_percentage 22225 1726882763.32301: checking to see if all hosts have failed and the running result is not ok 22225 1726882763.32302: done checking to see if all hosts have failed 22225 1726882763.32303: getting the remaining hosts for this loop 22225 1726882763.32305: done getting the remaining hosts for this loop 22225 1726882763.32310: getting the next task for host managed_node1 22225 1726882763.32316: done getting next task for host managed_node1 22225 1726882763.32320: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22225 1726882763.32325: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882763.32340: getting variables 22225 1726882763.32342: in VariableManager get_vars() 22225 1726882763.32388: Calling all_inventory to load vars for managed_node1 22225 1726882763.32392: Calling groups_inventory to load vars for managed_node1 22225 1726882763.32394: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882763.32406: Calling all_plugins_play to load vars for managed_node1 22225 1726882763.32409: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882763.32412: Calling groups_plugins_play to load vars for managed_node1 22225 1726882763.34434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882763.36944: done with get_vars() 22225 1726882763.36974: done getting variables 22225 1726882763.37045: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:23 -0400 (0:00:00.133) 0:00:18.763 ****** 22225 1726882763.37081: entering _queue_task() for managed_node1/package 22225 1726882763.37634: worker is 1 (out of 1 available) 22225 1726882763.37646: exiting _queue_task() for managed_node1/package 22225 1726882763.37656: done queuing things up, now waiting for results queue to drain 22225 1726882763.37658: waiting for pending results... 22225 1726882763.37843: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 22225 1726882763.37994: in run() - task 0affc7ec-ae25-ec05-55b7-00000000001f 22225 1726882763.37999: variable 'ansible_search_path' from source: unknown 22225 1726882763.38002: variable 'ansible_search_path' from source: unknown 22225 1726882763.38005: calling self._execute() 22225 1726882763.38109: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882763.38121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882763.38137: variable 'omit' from source: magic vars 22225 1726882763.38583: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.38608: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882763.38838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882763.39191: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882763.39197: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882763.39285: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882763.39332: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882763.39466: variable 'network_packages' from source: role '' defaults 22225 1726882763.39594: variable '__network_provider_setup' from source: role '' defaults 22225 1726882763.39608: variable '__network_service_name_default_nm' from source: role '' defaults 22225 1726882763.39695: variable 
'__network_service_name_default_nm' from source: role '' defaults 22225 1726882763.39729: variable '__network_packages_default_nm' from source: role '' defaults 22225 1726882763.39781: variable '__network_packages_default_nm' from source: role '' defaults 22225 1726882763.40010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882763.42761: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882763.42790: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882763.42835: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882763.42878: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882763.42911: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882763.43009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.43039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.43086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.43110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.43126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.43195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.43201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.43304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.43307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.43310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.43521: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22225 1726882763.43661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.43687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.43712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.43754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.43768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.43863: variable 'ansible_python' from source: facts 22225 1726882763.43892: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22225 1726882763.43977: variable '__network_wpa_supplicant_required' from source: role '' defaults 22225 1726882763.44067: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22225 1726882763.44206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.44231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.44256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.44299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.44313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.44368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.44394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.44418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.44461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.44474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.44630: variable 'network_connections' from source: task vars 22225 1726882763.44637: variable 'interface' from source: play vars 22225 1726882763.44749: variable 'interface' from source: play vars 22225 1726882763.44831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882763.44857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882763.44890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.44925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882763.44974: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882763.45428: variable 'network_connections' from source: task vars 22225 1726882763.45431: variable 'interface' from source: play vars 22225 1726882763.45434: variable 'interface' from source: play vars 22225 1726882763.45459: variable '__network_packages_default_wireless' from source: role '' defaults 22225 1726882763.45550: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882763.45885: variable 'network_connections' from source: task vars 22225 1726882763.45889: variable 'interface' from source: play vars 22225 1726882763.45994: variable 'interface' from source: play vars 22225 1726882763.46027: variable '__network_packages_default_team' from source: role '' defaults 22225 1726882763.46093: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882763.46432: variable 'network_connections' from source: task vars 22225 1726882763.46436: variable 'interface' from source: play vars 22225 1726882763.46503: variable 'interface' from source: play vars 22225 1726882763.46567: variable '__network_service_name_default_initscripts' from source: role '' defaults 22225 1726882763.46637: variable '__network_service_name_default_initscripts' from source: role '' defaults 22225 1726882763.46641: variable '__network_packages_default_initscripts' from source: role '' defaults 22225 1726882763.46748: variable '__network_packages_default_initscripts' from source: role '' defaults 22225 1726882763.46941: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22225 1726882763.47464: variable 'network_connections' from source: task vars 22225 1726882763.47468: variable 'interface' from source: play vars 22225 1726882763.47533: variable 'interface' from source: play vars 22225 1726882763.47543: variable 'ansible_distribution' from source: facts 22225 1726882763.47546: variable '__network_rh_distros' from source: role '' defaults 22225 1726882763.47553: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.47617: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22225 1726882763.47754: variable 'ansible_distribution' from source: facts 22225 
1726882763.47758: variable '__network_rh_distros' from source: role '' defaults 22225 1726882763.47764: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.47771: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22225 1726882763.47950: variable 'ansible_distribution' from source: facts 22225 1726882763.48026: variable '__network_rh_distros' from source: role '' defaults 22225 1726882763.48029: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.48032: variable 'network_provider' from source: set_fact 22225 1726882763.48035: variable 'ansible_facts' from source: unknown 22225 1726882763.48852: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22225 1726882763.48856: when evaluation is False, skipping this task 22225 1726882763.48859: _execute() done 22225 1726882763.48861: dumping result to json 22225 1726882763.48863: done dumping result, returning 22225 1726882763.48873: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-ec05-55b7-00000000001f] 22225 1726882763.48878: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001f 22225 1726882763.48992: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000001f 22225 1726882763.48995: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22225 1726882763.49048: no more pending results, returning what we have 22225 1726882763.49052: results queue empty 22225 1726882763.49053: checking for any_errors_fatal 22225 1726882763.49060: done checking for any_errors_fatal 22225 1726882763.49061: checking for max_fail_percentage 22225 1726882763.49063: done checking for max_fail_percentage 22225 1726882763.49063: checking to see if all hosts have failed and the running result is not ok 22225 1726882763.49064: done checking to see if all hosts have failed 22225 1726882763.49065: getting the remaining hosts for this loop 22225 1726882763.49067: done getting the remaining hosts for this loop 22225 1726882763.49071: getting the next task for host managed_node1 22225 1726882763.49079: done getting next task for host managed_node1 22225 1726882763.49083: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22225 1726882763.49086: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882763.49100: getting variables 22225 1726882763.49102: in VariableManager get_vars() 22225 1726882763.49146: Calling all_inventory to load vars for managed_node1 22225 1726882763.49149: Calling groups_inventory to load vars for managed_node1 22225 1726882763.49151: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882763.49162: Calling all_plugins_play to load vars for managed_node1 22225 1726882763.49164: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882763.49167: Calling groups_plugins_play to load vars for managed_node1 22225 1726882763.51069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882763.53253: done with get_vars() 22225 1726882763.53288: done getting variables 22225 1726882763.53360: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:23 -0400 (0:00:00.163) 0:00:18.926 ****** 22225 1726882763.53401: entering _queue_task() for managed_node1/package 22225 1726882763.53861: worker is 1 (out of 1 available) 22225 1726882763.53875: exiting _queue_task() for managed_node1/package 22225 1726882763.53889: done queuing things up, now waiting for results queue to drain 22225 1726882763.53890: waiting for pending results... 
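
The Install packages task skipped just above (tasks/main.yml:73) is gated on the Jinja subset test: it only runs when at least one entry of network_packages is missing from ansible_facts.packages, the dictionary normally populated by package_facts. A minimal sketch of a task using that gate, with assumed package-module arguments:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # assumed arguments; the when clause is the condition shown in the trace
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
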
22225 1726882763.54245: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22225 1726882763.54272: in run() - task 0affc7ec-ae25-ec05-55b7-000000000020 22225 1726882763.54298: variable 'ansible_search_path' from source: unknown 22225 1726882763.54307: variable 'ansible_search_path' from source: unknown 22225 1726882763.54359: calling self._execute() 22225 1726882763.54557: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882763.54562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882763.54565: variable 'omit' from source: magic vars 22225 1726882763.54896: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.54914: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882763.55052: variable 'network_state' from source: role '' defaults 22225 1726882763.55069: Evaluated conditional (network_state != {}): False 22225 1726882763.55076: when evaluation is False, skipping this task 22225 1726882763.55086: _execute() done 22225 1726882763.55098: dumping result to json 22225 1726882763.55106: done dumping result, returning 22225 1726882763.55118: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-ec05-55b7-000000000020] 22225 1726882763.55131: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000020 22225 1726882763.55374: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000020 22225 1726882763.55377: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882763.55434: no more pending results, returning what we have 22225 1726882763.55439: results queue empty 22225 1726882763.55440: checking for any_errors_fatal 22225 1726882763.55447: done checking for any_errors_fatal 22225 1726882763.55447: checking for max_fail_percentage 22225 1726882763.55449: done checking for max_fail_percentage 22225 1726882763.55450: checking to see if all hosts have failed and the running result is not ok 22225 1726882763.55451: done checking to see if all hosts have failed 22225 1726882763.55452: getting the remaining hosts for this loop 22225 1726882763.55454: done getting the remaining hosts for this loop 22225 1726882763.55459: getting the next task for host managed_node1 22225 1726882763.55467: done getting next task for host managed_node1 22225 1726882763.55471: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22225 1726882763.55475: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882763.55494: getting variables 22225 1726882763.55496: in VariableManager get_vars() 22225 1726882763.55544: Calling all_inventory to load vars for managed_node1 22225 1726882763.55547: Calling groups_inventory to load vars for managed_node1 22225 1726882763.55550: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882763.55564: Calling all_plugins_play to load vars for managed_node1 22225 1726882763.55567: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882763.55570: Calling groups_plugins_play to load vars for managed_node1 22225 1726882763.57436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882763.59793: done with get_vars() 22225 1726882763.59818: done getting variables 22225 1726882763.59883: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:23 -0400 (0:00:00.065) 0:00:18.991 ****** 22225 1726882763.59917: entering _queue_task() for managed_node1/package 22225 1726882763.60267: worker is 1 (out of 1 available) 22225 1726882763.60282: exiting _queue_task() for managed_node1/package 22225 1726882763.60295: done queuing things up, now waiting for results queue to drain 22225 1726882763.60297: waiting for pending results... 
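
Both nmstate-related install tasks (tasks/main.yml:85 and :96) share the network_state != {} gate, so neither runs while network_state keeps its empty default. A hedged example of the kind of play variable that would activate them; the interface name and settings here are purely illustrative:

network_state:
  interfaces:
    - name: eth1             # illustrative device name
      type: ethernet
      state: up
      ipv4:
        enabled: true
        dhcp: true
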
22225 1726882763.60595: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22225 1726882763.60928: in run() - task 0affc7ec-ae25-ec05-55b7-000000000021 22225 1726882763.60931: variable 'ansible_search_path' from source: unknown 22225 1726882763.60934: variable 'ansible_search_path' from source: unknown 22225 1726882763.60937: calling self._execute() 22225 1726882763.60940: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882763.60943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882763.60945: variable 'omit' from source: magic vars 22225 1726882763.61351: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.61368: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882763.61854: variable 'network_state' from source: role '' defaults 22225 1726882763.61889: Evaluated conditional (network_state != {}): False 22225 1726882763.61897: when evaluation is False, skipping this task 22225 1726882763.61905: _execute() done 22225 1726882763.61912: dumping result to json 22225 1726882763.61921: done dumping result, returning 22225 1726882763.61937: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-ec05-55b7-000000000021] 22225 1726882763.61953: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000021 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882763.62193: no more pending results, returning what we have 22225 1726882763.62198: results queue empty 22225 1726882763.62199: checking for any_errors_fatal 22225 1726882763.62210: done checking for any_errors_fatal 22225 1726882763.62211: checking for max_fail_percentage 22225 1726882763.62213: done checking for max_fail_percentage 22225 1726882763.62214: checking to see if all hosts have failed and the running result is not ok 22225 1726882763.62215: done checking to see if all hosts have failed 22225 1726882763.62215: getting the remaining hosts for this loop 22225 1726882763.62217: done getting the remaining hosts for this loop 22225 1726882763.62221: getting the next task for host managed_node1 22225 1726882763.62231: done getting next task for host managed_node1 22225 1726882763.62235: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22225 1726882763.62239: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882763.62256: getting variables 22225 1726882763.62257: in VariableManager get_vars() 22225 1726882763.62307: Calling all_inventory to load vars for managed_node1 22225 1726882763.62310: Calling groups_inventory to load vars for managed_node1 22225 1726882763.62312: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882763.62431: Calling all_plugins_play to load vars for managed_node1 22225 1726882763.62436: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882763.62442: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000021 22225 1726882763.62445: WORKER PROCESS EXITING 22225 1726882763.62449: Calling groups_plugins_play to load vars for managed_node1 22225 1726882763.64331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882763.66759: done with get_vars() 22225 1726882763.66819: done getting variables 22225 1726882763.66937: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:23 -0400 (0:00:00.070) 0:00:19.062 ****** 22225 1726882763.66969: entering _queue_task() for managed_node1/service 22225 1726882763.66971: Creating lock for service 22225 1726882763.67428: worker is 1 (out of 1 available) 22225 1726882763.67447: exiting _queue_task() for managed_node1/service 22225 1726882763.67458: done queuing things up, now waiting for results queue to drain 22225 1726882763.67459: waiting for pending results... 
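
The restart task queued above (tasks/main.yml:109) uses the service action and, in the trace below, is skipped on the same wireless-or-team condition. A minimal sketch; the unit name here is an assumption:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager     # assumed; the role may resolve the unit name differently
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
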
22225 1726882763.67701: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22225 1726882763.67930: in run() - task 0affc7ec-ae25-ec05-55b7-000000000022 22225 1726882763.67934: variable 'ansible_search_path' from source: unknown 22225 1726882763.67937: variable 'ansible_search_path' from source: unknown 22225 1726882763.67941: calling self._execute() 22225 1726882763.68228: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882763.68233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882763.68236: variable 'omit' from source: magic vars 22225 1726882763.68475: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.68521: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882763.68627: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882763.68856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882763.71384: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882763.71854: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882763.71899: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882763.71946: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882763.71972: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882763.72075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.72106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.72165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.72208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.72228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.72375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.72382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.72385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22225 1726882763.72388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.72391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.72539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.72564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.72589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.72631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.72647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.73048: variable 'network_connections' from source: task vars 22225 1726882763.73142: variable 'interface' from source: play vars 22225 1726882763.73214: variable 'interface' from source: play vars 22225 1726882763.73485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882763.73860: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882763.73864: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882763.73866: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882763.73895: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882763.73964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882763.73986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882763.74030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.74083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882763.74112: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882763.74383: variable 'network_connections' from source: task vars 22225 1726882763.74386: variable 'interface' from source: 
play vars 22225 1726882763.74476: variable 'interface' from source: play vars 22225 1726882763.74490: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22225 1726882763.74493: when evaluation is False, skipping this task 22225 1726882763.74496: _execute() done 22225 1726882763.74499: dumping result to json 22225 1726882763.74501: done dumping result, returning 22225 1726882763.74513: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-ec05-55b7-000000000022] 22225 1726882763.74516: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000022 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22225 1726882763.74789: no more pending results, returning what we have 22225 1726882763.74793: results queue empty 22225 1726882763.74794: checking for any_errors_fatal 22225 1726882763.74799: done checking for any_errors_fatal 22225 1726882763.74800: checking for max_fail_percentage 22225 1726882763.74802: done checking for max_fail_percentage 22225 1726882763.74803: checking to see if all hosts have failed and the running result is not ok 22225 1726882763.74804: done checking to see if all hosts have failed 22225 1726882763.74804: getting the remaining hosts for this loop 22225 1726882763.74806: done getting the remaining hosts for this loop 22225 1726882763.74809: getting the next task for host managed_node1 22225 1726882763.74814: done getting next task for host managed_node1 22225 1726882763.74818: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22225 1726882763.74821: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882763.74836: getting variables 22225 1726882763.74838: in VariableManager get_vars() 22225 1726882763.74877: Calling all_inventory to load vars for managed_node1 22225 1726882763.74882: Calling groups_inventory to load vars for managed_node1 22225 1726882763.74884: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882763.74894: Calling all_plugins_play to load vars for managed_node1 22225 1726882763.74897: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882763.74899: Calling groups_plugins_play to load vars for managed_node1 22225 1726882763.75426: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000022 22225 1726882763.75431: WORKER PROCESS EXITING 22225 1726882763.77586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882763.81150: done with get_vars() 22225 1726882763.81299: done getting variables 22225 1726882763.81369: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:23 -0400 (0:00:00.144) 0:00:19.206 ****** 22225 1726882763.81403: entering _queue_task() for managed_node1/service 22225 1726882763.81982: worker is 1 (out of 1 available) 22225 1726882763.81996: exiting _queue_task() for managed_node1/service 22225 1726882763.82009: done queuing things up, now waiting for results queue to drain 22225 1726882763.82010: waiting for pending results... 
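
The Enable and start NetworkManager task queued above (tasks/main.yml:122) is the first task in this stretch whose gating conditional passes: the trace below evaluates network_provider == "nm" or network_state != {} to True and resolves network_service_name from the role defaults. A minimal sketch of a service task shaped like that evaluation (argument names are assumptions):

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolved from role defaults in the trace
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
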
22225 1726882763.82487: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22225 1726882763.82493: in run() - task 0affc7ec-ae25-ec05-55b7-000000000023 22225 1726882763.82531: variable 'ansible_search_path' from source: unknown 22225 1726882763.82535: variable 'ansible_search_path' from source: unknown 22225 1726882763.82629: calling self._execute() 22225 1726882763.82662: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882763.82668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882763.82678: variable 'omit' from source: magic vars 22225 1726882763.83094: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.83107: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882763.83285: variable 'network_provider' from source: set_fact 22225 1726882763.83289: variable 'network_state' from source: role '' defaults 22225 1726882763.83306: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22225 1726882763.83312: variable 'omit' from source: magic vars 22225 1726882763.83367: variable 'omit' from source: magic vars 22225 1726882763.83396: variable 'network_service_name' from source: role '' defaults 22225 1726882763.83639: variable 'network_service_name' from source: role '' defaults 22225 1726882763.83643: variable '__network_provider_setup' from source: role '' defaults 22225 1726882763.83645: variable '__network_service_name_default_nm' from source: role '' defaults 22225 1726882763.83675: variable '__network_service_name_default_nm' from source: role '' defaults 22225 1726882763.83685: variable '__network_packages_default_nm' from source: role '' defaults 22225 1726882763.83770: variable '__network_packages_default_nm' from source: role '' defaults 22225 1726882763.84076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882763.86788: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882763.86903: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882763.87056: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882763.87100: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882763.87188: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882763.87391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.87441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.87468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.87672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 22225 1726882763.87689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.87770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.87846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.87872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.87925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.87941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.88232: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22225 1726882763.88377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.88410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.88438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.88487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.88528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.88612: variable 'ansible_python' from source: facts 22225 1726882763.88637: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22225 1726882763.88784: variable '__network_wpa_supplicant_required' from source: role '' defaults 22225 1726882763.88829: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22225 1726882763.88967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.88993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.89020: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.89064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.89077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.89135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882763.89218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882763.89224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.89228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882763.89257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882763.89412: variable 'network_connections' from source: task vars 22225 1726882763.89420: variable 'interface' from source: play vars 22225 1726882763.89511: variable 'interface' from source: play vars 22225 1726882763.89641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882763.89928: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882763.89932: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882763.89982: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882763.90039: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882763.90115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882763.90153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882763.90191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882763.90238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882763.90287: variable '__network_wireless_connections_defined' from source: role 
'' defaults 22225 1726882763.90729: variable 'network_connections' from source: task vars 22225 1726882763.90740: variable 'interface' from source: play vars 22225 1726882763.90745: variable 'interface' from source: play vars 22225 1726882763.90791: variable '__network_packages_default_wireless' from source: role '' defaults 22225 1726882763.90887: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882763.91246: variable 'network_connections' from source: task vars 22225 1726882763.91250: variable 'interface' from source: play vars 22225 1726882763.91353: variable 'interface' from source: play vars 22225 1726882763.91365: variable '__network_packages_default_team' from source: role '' defaults 22225 1726882763.91529: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882763.91810: variable 'network_connections' from source: task vars 22225 1726882763.91813: variable 'interface' from source: play vars 22225 1726882763.91900: variable 'interface' from source: play vars 22225 1726882763.91975: variable '__network_service_name_default_initscripts' from source: role '' defaults 22225 1726882763.92075: variable '__network_service_name_default_initscripts' from source: role '' defaults 22225 1726882763.92079: variable '__network_packages_default_initscripts' from source: role '' defaults 22225 1726882763.92112: variable '__network_packages_default_initscripts' from source: role '' defaults 22225 1726882763.92430: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22225 1726882763.92958: variable 'network_connections' from source: task vars 22225 1726882763.92961: variable 'interface' from source: play vars 22225 1726882763.93035: variable 'interface' from source: play vars 22225 1726882763.93046: variable 'ansible_distribution' from source: facts 22225 1726882763.93049: variable '__network_rh_distros' from source: role '' defaults 22225 1726882763.93055: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.93079: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22225 1726882763.93290: variable 'ansible_distribution' from source: facts 22225 1726882763.93294: variable '__network_rh_distros' from source: role '' defaults 22225 1726882763.93299: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.93306: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22225 1726882763.93506: variable 'ansible_distribution' from source: facts 22225 1726882763.93509: variable '__network_rh_distros' from source: role '' defaults 22225 1726882763.93628: variable 'ansible_distribution_major_version' from source: facts 22225 1726882763.93632: variable 'network_provider' from source: set_fact 22225 1726882763.93635: variable 'omit' from source: magic vars 22225 1726882763.93638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882763.93641: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882763.93659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882763.93686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882763.93697: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882763.93730: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882763.93734: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882763.93736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882763.93848: Set connection var ansible_connection to ssh 22225 1726882763.93859: Set connection var ansible_pipelining to False 22225 1726882763.93867: Set connection var ansible_shell_executable to /bin/sh 22225 1726882763.93873: Set connection var ansible_timeout to 10 22225 1726882763.93876: Set connection var ansible_shell_type to sh 22225 1726882763.93885: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882763.93923: variable 'ansible_shell_executable' from source: unknown 22225 1726882763.93926: variable 'ansible_connection' from source: unknown 22225 1726882763.93929: variable 'ansible_module_compression' from source: unknown 22225 1726882763.93931: variable 'ansible_shell_type' from source: unknown 22225 1726882763.93934: variable 'ansible_shell_executable' from source: unknown 22225 1726882763.93936: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882763.93941: variable 'ansible_pipelining' from source: unknown 22225 1726882763.93943: variable 'ansible_timeout' from source: unknown 22225 1726882763.93948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882763.94069: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882763.94235: variable 'omit' from source: magic vars 22225 1726882763.94243: starting attempt loop 22225 1726882763.94245: running the handler 22225 1726882763.94247: variable 'ansible_facts' from source: unknown 22225 1726882763.95370: _low_level_execute_command(): starting 22225 1726882763.95376: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882763.96139: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882763.96152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882763.96175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882763.96197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882763.96211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882763.96277: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882763.96319: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882763.96334: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882763.96352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882763.96443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882763.98217: stdout chunk (state=3): >>>/root <<< 22225 1726882763.98425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882763.98429: stdout chunk (state=3): >>><<< 22225 1726882763.98431: stderr chunk (state=3): >>><<< 22225 1726882763.98529: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882763.98533: _low_level_execute_command(): starting 22225 1726882763.98536: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417 `" && echo ansible-tmp-1726882763.9845495-22904-182251196303417="` echo /root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417 `" ) && sleep 0' 22225 1726882763.99138: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882763.99152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882763.99167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882763.99196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882763.99300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882763.99320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882763.99418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882764.01435: stdout chunk (state=3): >>>ansible-tmp-1726882763.9845495-22904-182251196303417=/root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417 <<< 22225 1726882764.01630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882764.01649: stderr chunk (state=3): >>><<< 22225 1726882764.01668: stdout chunk (state=3): >>><<< 22225 1726882764.01749: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882763.9845495-22904-182251196303417=/root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882764.01752: variable 'ansible_module_compression' from source: unknown 22225 1726882764.01800: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 22225 1726882764.01810: ANSIBALLZ: Acquiring lock 22225 1726882764.01818: ANSIBALLZ: Lock acquired: 140272895053888 22225 1726882764.01835: ANSIBALLZ: Creating module 22225 1726882764.52854: ANSIBALLZ: Writing module into payload 22225 1726882764.52970: ANSIBALLZ: Writing module 22225 1726882764.53002: ANSIBALLZ: Renaming module 22225 1726882764.53006: ANSIBALLZ: Done creating module 22225 1726882764.53052: variable 'ansible_facts' from source: unknown 22225 1726882764.53228: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/AnsiballZ_systemd.py 22225 1726882764.53471: Sending initial data 22225 1726882764.53475: Sent initial data (156 bytes) 22225 1726882764.54127: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882764.54145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882764.54163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882764.54200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882764.54399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882764.56007: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22225 1726882764.56019: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 22225 1726882764.56031: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 22225 1726882764.56045: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882764.56091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882764.56149: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpsjz6tc76 /root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/AnsiballZ_systemd.py <<< 22225 1726882764.56156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/AnsiballZ_systemd.py" <<< 22225 1726882764.56200: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpsjz6tc76" to remote "/root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/AnsiballZ_systemd.py" <<< 22225 1726882764.56208: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/AnsiballZ_systemd.py" <<< 22225 1726882764.58130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882764.58135: stderr chunk (state=3): >>><<< 22225 1726882764.58139: stdout chunk (state=3): >>><<< 22225 1726882764.58141: done transferring module to remote 22225 1726882764.58143: _low_level_execute_command(): starting 22225 1726882764.58145: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/ /root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/AnsiballZ_systemd.py && sleep 0' 22225 1726882764.59078: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882764.59107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882764.59111: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882764.59162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882764.59166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882764.59232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882764.61276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882764.61282: stdout chunk (state=3): >>><<< 22225 1726882764.61284: stderr chunk (state=3): >>><<< 22225 1726882764.61287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882764.61289: _low_level_execute_command(): starting 22225 1726882764.61292: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/AnsiballZ_systemd.py && sleep 0' 22225 1726882764.61975: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882764.61995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882764.62011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882764.62043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882764.62132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882764.62145: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882764.62187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882764.62210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882764.62253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882764.62379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882764.94298: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "12005376", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3516461056", "CPUUsageNSec": "1633132000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 22225 1726882764.94303: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": 
"shutdown.target", "Before": "NetworkManager-wait-online.service multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22225 1726882764.96146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882764.96632: stderr chunk (state=3): >>><<< 22225 1726882764.96636: stdout chunk (state=3): >>><<< 22225 1726882764.96641: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "12005376", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3516461056", "CPUUsageNSec": "1633132000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 
originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882764.96791: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882764.96806: _low_level_execute_command(): starting 22225 1726882764.96817: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882763.9845495-22904-182251196303417/ > /dev/null 2>&1 && sleep 0' 22225 1726882764.98240: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882764.98300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882764.98542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882764.98563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882764.98641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882765.00633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882765.00636: stdout chunk (state=3): >>><<< 22225 
1726882765.00639: stderr chunk (state=3): >>><<< 22225 1726882765.00654: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882765.00667: handler run complete 22225 1726882765.00850: attempt loop complete, returning result 22225 1726882765.00865: _execute() done 22225 1726882765.00872: dumping result to json 22225 1726882765.00901: done dumping result, returning 22225 1726882765.00916: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-ec05-55b7-000000000023] 22225 1726882765.01329: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000023 22225 1726882765.01500: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000023 22225 1726882765.01503: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882765.01560: no more pending results, returning what we have 22225 1726882765.01564: results queue empty 22225 1726882765.01565: checking for any_errors_fatal 22225 1726882765.01572: done checking for any_errors_fatal 22225 1726882765.01572: checking for max_fail_percentage 22225 1726882765.01574: done checking for max_fail_percentage 22225 1726882765.01575: checking to see if all hosts have failed and the running result is not ok 22225 1726882765.01576: done checking to see if all hosts have failed 22225 1726882765.01577: getting the remaining hosts for this loop 22225 1726882765.01579: done getting the remaining hosts for this loop 22225 1726882765.01586: getting the next task for host managed_node1 22225 1726882765.01593: done getting next task for host managed_node1 22225 1726882765.01598: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22225 1726882765.01601: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 22225 1726882765.01613: getting variables 22225 1726882765.01615: in VariableManager get_vars() 22225 1726882765.01661: Calling all_inventory to load vars for managed_node1 22225 1726882765.01664: Calling groups_inventory to load vars for managed_node1 22225 1726882765.01666: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882765.01678: Calling all_plugins_play to load vars for managed_node1 22225 1726882765.01684: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882765.01688: Calling groups_plugins_play to load vars for managed_node1 22225 1726882765.05701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882765.10083: done with get_vars() 22225 1726882765.10117: done getting variables 22225 1726882765.10392: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:25 -0400 (0:00:01.290) 0:00:20.496 ****** 22225 1726882765.10432: entering _queue_task() for managed_node1/service 22225 1726882765.11209: worker is 1 (out of 1 available) 22225 1726882765.11226: exiting _queue_task() for managed_node1/service 22225 1726882765.11241: done queuing things up, now waiting for results queue to drain 22225 1726882765.11242: waiting for pending results... 
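The task that just completed above ("Enable and start NetworkManager") invoked ansible.legacy.systemd with name=NetworkManager, state=started, enabled=True; its result is censored in the output because the role runs it with no_log enabled. As a minimal illustrative sketch only, reconstructed from the logged module arguments and not taken from the role's actual task file, an equivalent standalone task would be:

- name: Enable and start NetworkManager
  ansible.builtin.systemd:      # log shows ansible.legacy.systemd; builtin form is equivalent here
    name: NetworkManager
    state: started
    enabled: true
  no_log: true                  # matches '_ansible_no_log': True in the logged invocation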
22225 1726882765.12044: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22225 1726882765.12103: in run() - task 0affc7ec-ae25-ec05-55b7-000000000024 22225 1726882765.12337: variable 'ansible_search_path' from source: unknown 22225 1726882765.12347: variable 'ansible_search_path' from source: unknown 22225 1726882765.12352: calling self._execute() 22225 1726882765.12711: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882765.12715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882765.12718: variable 'omit' from source: magic vars 22225 1726882765.13555: variable 'ansible_distribution_major_version' from source: facts 22225 1726882765.13829: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882765.13875: variable 'network_provider' from source: set_fact 22225 1726882765.13889: Evaluated conditional (network_provider == "nm"): True 22225 1726882765.14038: variable '__network_wpa_supplicant_required' from source: role '' defaults 22225 1726882765.14291: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22225 1726882765.14676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882765.19531: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882765.19535: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882765.19653: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882765.19697: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882765.19798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882765.20069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882765.20188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882765.20206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882765.20258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882765.20348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882765.20461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882765.20542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 22225 1726882765.20571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882765.20730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882765.20756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882765.20928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882765.20952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882765.20989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882765.21077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882765.21145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882765.21606: variable 'network_connections' from source: task vars 22225 1726882765.21609: variable 'interface' from source: play vars 22225 1726882765.21666: variable 'interface' from source: play vars 22225 1726882765.21808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882765.22217: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882765.22409: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882765.22451: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882765.22544: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882765.22804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882765.22807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882765.22810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882765.22812: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 
1726882765.22951: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882765.23628: variable 'network_connections' from source: task vars 22225 1726882765.23632: variable 'interface' from source: play vars 22225 1726882765.23636: variable 'interface' from source: play vars 22225 1726882765.23703: Evaluated conditional (__network_wpa_supplicant_required): False 22225 1726882765.23770: when evaluation is False, skipping this task 22225 1726882765.23778: _execute() done 22225 1726882765.23788: dumping result to json 22225 1726882765.23929: done dumping result, returning 22225 1726882765.23932: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-ec05-55b7-000000000024] 22225 1726882765.23945: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000024 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22225 1726882765.24194: no more pending results, returning what we have 22225 1726882765.24198: results queue empty 22225 1726882765.24199: checking for any_errors_fatal 22225 1726882765.24230: done checking for any_errors_fatal 22225 1726882765.24231: checking for max_fail_percentage 22225 1726882765.24234: done checking for max_fail_percentage 22225 1726882765.24235: checking to see if all hosts have failed and the running result is not ok 22225 1726882765.24235: done checking to see if all hosts have failed 22225 1726882765.24236: getting the remaining hosts for this loop 22225 1726882765.24238: done getting the remaining hosts for this loop 22225 1726882765.24243: getting the next task for host managed_node1 22225 1726882765.24251: done getting next task for host managed_node1 22225 1726882765.24255: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22225 1726882765.24259: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882765.24276: getting variables 22225 1726882765.24278: in VariableManager get_vars() 22225 1726882765.24433: Calling all_inventory to load vars for managed_node1 22225 1726882765.24437: Calling groups_inventory to load vars for managed_node1 22225 1726882765.24439: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882765.24451: Calling all_plugins_play to load vars for managed_node1 22225 1726882765.24454: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882765.24457: Calling groups_plugins_play to load vars for managed_node1 22225 1726882765.25529: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000024 22225 1726882765.25533: WORKER PROCESS EXITING 22225 1726882765.28209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882765.32760: done with get_vars() 22225 1726882765.32799: done getting variables 22225 1726882765.32971: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:25 -0400 (0:00:00.225) 0:00:20.722 ****** 22225 1726882765.33008: entering _queue_task() for managed_node1/service 22225 1726882765.33869: worker is 1 (out of 1 available) 22225 1726882765.33886: exiting _queue_task() for managed_node1/service 22225 1726882765.33897: done queuing things up, now waiting for results queue to drain 22225 1726882765.33899: waiting for pending results... 
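The "Enable and start wpa_supplicant" task above was skipped because the role default __network_wpa_supplicant_required evaluated to False for this run (no wireless or IEEE 802.1X connections are defined). A sketch of that skip pattern, assuming a plain service task guarded by the same variable (illustrative only, not the role's literal task source):

- name: Enable and start wpa_supplicant
  ansible.builtin.service:      # the log loads the 'service' action plugin for this task
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required   # False here, so the task is skipped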
22225 1726882765.34155: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 22225 1726882765.34312: in run() - task 0affc7ec-ae25-ec05-55b7-000000000025 22225 1726882765.34337: variable 'ansible_search_path' from source: unknown 22225 1726882765.34345: variable 'ansible_search_path' from source: unknown 22225 1726882765.34393: calling self._execute() 22225 1726882765.34507: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882765.34521: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882765.34539: variable 'omit' from source: magic vars 22225 1726882765.34944: variable 'ansible_distribution_major_version' from source: facts 22225 1726882765.34962: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882765.35092: variable 'network_provider' from source: set_fact 22225 1726882765.35105: Evaluated conditional (network_provider == "initscripts"): False 22225 1726882765.35112: when evaluation is False, skipping this task 22225 1726882765.35119: _execute() done 22225 1726882765.35129: dumping result to json 22225 1726882765.35138: done dumping result, returning 22225 1726882765.35155: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-ec05-55b7-000000000025] 22225 1726882765.35168: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000025 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882765.35334: no more pending results, returning what we have 22225 1726882765.35339: results queue empty 22225 1726882765.35340: checking for any_errors_fatal 22225 1726882765.35349: done checking for any_errors_fatal 22225 1726882765.35350: checking for max_fail_percentage 22225 1726882765.35351: done checking for max_fail_percentage 22225 1726882765.35352: checking to see if all hosts have failed and the running result is not ok 22225 1726882765.35353: done checking to see if all hosts have failed 22225 1726882765.35354: getting the remaining hosts for this loop 22225 1726882765.35355: done getting the remaining hosts for this loop 22225 1726882765.35360: getting the next task for host managed_node1 22225 1726882765.35367: done getting next task for host managed_node1 22225 1726882765.35370: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22225 1726882765.35375: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882765.35398: getting variables 22225 1726882765.35400: in VariableManager get_vars() 22225 1726882765.35445: Calling all_inventory to load vars for managed_node1 22225 1726882765.35448: Calling groups_inventory to load vars for managed_node1 22225 1726882765.35450: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882765.35466: Calling all_plugins_play to load vars for managed_node1 22225 1726882765.35469: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882765.35472: Calling groups_plugins_play to load vars for managed_node1 22225 1726882765.35486: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000025 22225 1726882765.35489: WORKER PROCESS EXITING 22225 1726882765.39273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882765.43650: done with get_vars() 22225 1726882765.43686: done getting variables 22225 1726882765.43752: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:25 -0400 (0:00:00.107) 0:00:20.830 ****** 22225 1726882765.43790: entering _queue_task() for managed_node1/copy 22225 1726882765.44555: worker is 1 (out of 1 available) 22225 1726882765.44570: exiting _queue_task() for managed_node1/copy 22225 1726882765.44585: done queuing things up, now waiting for results queue to drain 22225 1726882765.44586: waiting for pending results... 
22225 1726882765.45342: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22225 1726882765.45729: in run() - task 0affc7ec-ae25-ec05-55b7-000000000026 22225 1726882765.45734: variable 'ansible_search_path' from source: unknown 22225 1726882765.45737: variable 'ansible_search_path' from source: unknown 22225 1726882765.45740: calling self._execute() 22225 1726882765.45742: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882765.45746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882765.45749: variable 'omit' from source: magic vars 22225 1726882765.46503: variable 'ansible_distribution_major_version' from source: facts 22225 1726882765.46729: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882765.46828: variable 'network_provider' from source: set_fact 22225 1726882765.46914: Evaluated conditional (network_provider == "initscripts"): False 22225 1726882765.46924: when evaluation is False, skipping this task 22225 1726882765.46931: _execute() done 22225 1726882765.46939: dumping result to json 22225 1726882765.46946: done dumping result, returning 22225 1726882765.46960: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-ec05-55b7-000000000026] 22225 1726882765.46970: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000026 skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22225 1726882765.47148: no more pending results, returning what we have 22225 1726882765.47153: results queue empty 22225 1726882765.47154: checking for any_errors_fatal 22225 1726882765.47163: done checking for any_errors_fatal 22225 1726882765.47163: checking for max_fail_percentage 22225 1726882765.47165: done checking for max_fail_percentage 22225 1726882765.47166: checking to see if all hosts have failed and the running result is not ok 22225 1726882765.47167: done checking to see if all hosts have failed 22225 1726882765.47168: getting the remaining hosts for this loop 22225 1726882765.47169: done getting the remaining hosts for this loop 22225 1726882765.47174: getting the next task for host managed_node1 22225 1726882765.47183: done getting next task for host managed_node1 22225 1726882765.47188: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22225 1726882765.47192: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882765.47213: getting variables 22225 1726882765.47215: in VariableManager get_vars() 22225 1726882765.47261: Calling all_inventory to load vars for managed_node1 22225 1726882765.47264: Calling groups_inventory to load vars for managed_node1 22225 1726882765.47266: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882765.47284: Calling all_plugins_play to load vars for managed_node1 22225 1726882765.47287: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882765.47291: Calling groups_plugins_play to load vars for managed_node1 22225 1726882765.47813: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000026 22225 1726882765.47817: WORKER PROCESS EXITING 22225 1726882765.49236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882765.51262: done with get_vars() 22225 1726882765.51298: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:25 -0400 (0:00:00.077) 0:00:20.908 ****** 22225 1726882765.51582: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 22225 1726882765.51584: Creating lock for fedora.linux_system_roles.network_connections 22225 1726882765.52319: worker is 1 (out of 1 available) 22225 1726882765.52338: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 22225 1726882765.52352: done queuing things up, now waiting for results queue to drain 22225 1726882765.52353: waiting for pending results... 22225 1726882765.52777: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22225 1726882765.53131: in run() - task 0affc7ec-ae25-ec05-55b7-000000000027 22225 1726882765.53135: variable 'ansible_search_path' from source: unknown 22225 1726882765.53138: variable 'ansible_search_path' from source: unknown 22225 1726882765.53141: calling self._execute() 22225 1726882765.53144: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882765.53147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882765.53155: variable 'omit' from source: magic vars 22225 1726882765.53630: variable 'ansible_distribution_major_version' from source: facts 22225 1726882765.53651: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882765.53664: variable 'omit' from source: magic vars 22225 1726882765.53730: variable 'omit' from source: magic vars 22225 1726882765.54004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882765.56840: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882765.57103: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882765.57150: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882765.57190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882765.57258: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882765.57434: 
variable 'network_provider' from source: set_fact 22225 1726882765.57790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882765.57888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882765.57921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882765.58024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882765.58106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882765.58270: variable 'omit' from source: magic vars 22225 1726882765.58467: variable 'omit' from source: magic vars 22225 1726882765.58755: variable 'network_connections' from source: task vars 22225 1726882765.58759: variable 'interface' from source: play vars 22225 1726882765.58762: variable 'interface' from source: play vars 22225 1726882765.58861: variable 'omit' from source: magic vars 22225 1726882765.58870: variable '__lsr_ansible_managed' from source: task vars 22225 1726882765.58938: variable '__lsr_ansible_managed' from source: task vars 22225 1726882765.59263: Loaded config def from plugin (lookup/template) 22225 1726882765.59267: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22225 1726882765.59304: File lookup term: get_ansible_managed.j2 22225 1726882765.59307: variable 'ansible_search_path' from source: unknown 22225 1726882765.59319: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22225 1726882765.59326: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22225 1726882765.59338: variable 'ansible_search_path' from source: unknown 22225 1726882765.66755: variable 'ansible_managed' from source: unknown 22225 1726882765.66910: variable 'omit' from source: 
magic vars 22225 1726882765.66943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882765.66973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882765.66995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882765.67048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882765.67052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882765.67057: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882765.67062: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882765.67067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882765.67157: Set connection var ansible_connection to ssh 22225 1726882765.67226: Set connection var ansible_pipelining to False 22225 1726882765.67230: Set connection var ansible_shell_executable to /bin/sh 22225 1726882765.67232: Set connection var ansible_timeout to 10 22225 1726882765.67235: Set connection var ansible_shell_type to sh 22225 1726882765.67238: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882765.67240: variable 'ansible_shell_executable' from source: unknown 22225 1726882765.67242: variable 'ansible_connection' from source: unknown 22225 1726882765.67244: variable 'ansible_module_compression' from source: unknown 22225 1726882765.67246: variable 'ansible_shell_type' from source: unknown 22225 1726882765.67248: variable 'ansible_shell_executable' from source: unknown 22225 1726882765.67250: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882765.67252: variable 'ansible_pipelining' from source: unknown 22225 1726882765.67254: variable 'ansible_timeout' from source: unknown 22225 1726882765.67257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882765.67371: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882765.67386: variable 'omit' from source: magic vars 22225 1726882765.67389: starting attempt loop 22225 1726882765.67391: running the handler 22225 1726882765.67492: _low_level_execute_command(): starting 22225 1726882765.67495: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882765.68201: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882765.68213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882765.68226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882765.68244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882765.68259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882765.68262: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882765.68272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 
1726882765.68290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882765.68298: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882765.68305: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882765.68313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882765.68326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882765.68339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882765.68426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882765.68435: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882765.68449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882765.68536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882765.70333: stdout chunk (state=3): >>>/root <<< 22225 1726882765.70540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882765.70544: stdout chunk (state=3): >>><<< 22225 1726882765.70574: stderr chunk (state=3): >>><<< 22225 1726882765.70594: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882765.70627: _low_level_execute_command(): starting 22225 1726882765.70631: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740 `" && echo ansible-tmp-1726882765.705953-22963-156178569264740="` echo /root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740 `" ) && sleep 0' 22225 1726882765.71271: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882765.71299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882765.71304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882765.71313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882765.71363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882765.71371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882765.71373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882765.71425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882765.73407: stdout chunk (state=3): >>>ansible-tmp-1726882765.705953-22963-156178569264740=/root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740 <<< 22225 1726882765.73510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882765.73566: stderr chunk (state=3): >>><<< 22225 1726882765.73568: stdout chunk (state=3): >>><<< 22225 1726882765.73580: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882765.705953-22963-156178569264740=/root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882765.73632: variable 'ansible_module_compression' from source: unknown 22225 1726882765.73677: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 22225 1726882765.73683: ANSIBALLZ: Acquiring lock 22225 1726882765.73687: ANSIBALLZ: Lock acquired: 140272893346064 22225 1726882765.73689: ANSIBALLZ: Creating module 22225 1726882765.92085: ANSIBALLZ: Writing module into payload 22225 1726882765.92319: ANSIBALLZ: Writing module 22225 1726882765.92345: ANSIBALLZ: Renaming module 22225 1726882765.92352: ANSIBALLZ: Done creating module 22225 
1726882765.92374: variable 'ansible_facts' from source: unknown 22225 1726882765.92444: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/AnsiballZ_network_connections.py 22225 1726882765.92554: Sending initial data 22225 1726882765.92558: Sent initial data (167 bytes) 22225 1726882765.93021: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882765.93056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882765.93059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882765.93062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882765.93064: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882765.93066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882765.93068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882765.93126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882765.93136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882765.93139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882765.93192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882765.94933: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 22225 1726882765.94937: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882765.94990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882765.95057: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpxstkvvpg /root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/AnsiballZ_network_connections.py <<< 22225 1726882765.95061: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/AnsiballZ_network_connections.py" <<< 22225 1726882765.95116: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpxstkvvpg" to remote "/root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/AnsiballZ_network_connections.py" <<< 22225 1726882765.96231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882765.96348: stderr chunk (state=3): >>><<< 22225 1726882765.96351: stdout chunk (state=3): >>><<< 22225 1726882765.96354: done transferring module to remote 22225 1726882765.96356: _low_level_execute_command(): starting 22225 1726882765.96358: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/ /root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/AnsiballZ_network_connections.py && sleep 0' 22225 1726882765.96887: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882765.96891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882765.96919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882765.96925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882765.96927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882765.96976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882765.96980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882765.97041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882765.98871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882765.98911: stderr chunk (state=3): >>><<< 22225 1726882765.98915: stdout chunk (state=3): >>><<< 22225 1726882765.98932: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882765.98936: _low_level_execute_command(): starting 22225 1726882765.98938: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/AnsiballZ_network_connections.py && sleep 0' 22225 1726882765.99363: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882765.99366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882765.99374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882765.99376: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882765.99379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882765.99421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882765.99428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882765.99487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882768.32072: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", 
"connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 22225 1726882768.34036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882768.34098: stderr chunk (state=3): >>><<< 22225 1726882768.34101: stdout chunk (state=3): >>><<< 22225 1726882768.34117: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
22225 1726882768.34159: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/32', '2001:db8::3/32', '2001:db8::4/32'], 'gateway6': '2001:db8::1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882768.34167: _low_level_execute_command(): starting 22225 1726882768.34173: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882765.705953-22963-156178569264740/ > /dev/null 2>&1 && sleep 0' 22225 1726882768.34654: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.34658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882768.34662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.34664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.34667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.34721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882768.34731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882768.34733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882768.34783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882768.36928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882768.36932: stderr chunk (state=3): >>><<< 22225 1726882768.36934: stdout chunk (state=3): >>><<< 22225 1726882768.36937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882768.36946: handler run complete 22225 1726882768.36949: attempt loop complete, returning result 22225 1726882768.36952: _execute() done 22225 1726882768.36955: dumping result to json 22225 1726882768.36958: done dumping result, returning 22225 1726882768.36961: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-ec05-55b7-000000000027] 22225 1726882768.36964: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000027 22225 1726882768.37083: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000027 22225 1726882768.37086: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d (not-active) 22225 1726882768.37404: no more pending results, returning what we have 22225 1726882768.37407: results queue empty 22225 1726882768.37408: checking for any_errors_fatal 22225 1726882768.37413: done checking for any_errors_fatal 22225 1726882768.37414: checking for max_fail_percentage 22225 1726882768.37416: done checking for max_fail_percentage 22225 1726882768.37417: checking to see if all hosts have failed and the running result is not ok 22225 1726882768.37418: done checking to see if all hosts have failed 22225 1726882768.37418: getting the remaining hosts for this loop 22225 1726882768.37420: done getting the remaining hosts for this loop 22225 1726882768.37425: getting the next task for host managed_node1 22225 1726882768.37430: done getting next task for host managed_node1 22225 1726882768.37434: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22225 1726882768.37437: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882768.37666: getting variables 22225 1726882768.37668: in VariableManager get_vars() 22225 1726882768.37711: Calling all_inventory to load vars for managed_node1 22225 1726882768.37714: Calling groups_inventory to load vars for managed_node1 22225 1726882768.37716: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882768.37729: Calling all_plugins_play to load vars for managed_node1 22225 1726882768.37732: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882768.37742: Calling groups_plugins_play to load vars for managed_node1 22225 1726882768.40430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882768.42748: done with get_vars() 22225 1726882768.42778: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:28 -0400 (0:00:02.912) 0:00:23.821 ****** 22225 1726882768.42875: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 22225 1726882768.42877: Creating lock for fedora.linux_system_roles.network_state 22225 1726882768.43235: worker is 1 (out of 1 available) 22225 1726882768.43251: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 22225 1726882768.43265: done queuing things up, now waiting for results queue to drain 22225 1726882768.43267: waiting for pending results... 22225 1726882768.43567: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 22225 1726882768.43717: in run() - task 0affc7ec-ae25-ec05-55b7-000000000028 22225 1726882768.43738: variable 'ansible_search_path' from source: unknown 22225 1726882768.43742: variable 'ansible_search_path' from source: unknown 22225 1726882768.43790: calling self._execute() 22225 1726882768.43863: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.43869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.43878: variable 'omit' from source: magic vars 22225 1726882768.44182: variable 'ansible_distribution_major_version' from source: facts 22225 1726882768.44189: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882768.44278: variable 'network_state' from source: role '' defaults 22225 1726882768.44287: Evaluated conditional (network_state != {}): False 22225 1726882768.44290: when evaluation is False, skipping this task 22225 1726882768.44299: _execute() done 22225 1726882768.44302: dumping result to json 22225 1726882768.44305: done dumping result, returning 22225 1726882768.44308: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-ec05-55b7-000000000028] 22225 1726882768.44311: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000028 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882768.44463: no more pending results, returning what we have 22225 1726882768.44468: results queue empty 22225 1726882768.44469: checking for any_errors_fatal 22225 
1726882768.44483: done checking for any_errors_fatal 22225 1726882768.44484: checking for max_fail_percentage 22225 1726882768.44485: done checking for max_fail_percentage 22225 1726882768.44486: checking to see if all hosts have failed and the running result is not ok 22225 1726882768.44487: done checking to see if all hosts have failed 22225 1726882768.44488: getting the remaining hosts for this loop 22225 1726882768.44490: done getting the remaining hosts for this loop 22225 1726882768.44493: getting the next task for host managed_node1 22225 1726882768.44499: done getting next task for host managed_node1 22225 1726882768.44503: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22225 1726882768.44506: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882768.44520: getting variables 22225 1726882768.44530: in VariableManager get_vars() 22225 1726882768.44564: Calling all_inventory to load vars for managed_node1 22225 1726882768.44567: Calling groups_inventory to load vars for managed_node1 22225 1726882768.44569: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882768.44575: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000028 22225 1726882768.44577: WORKER PROCESS EXITING 22225 1726882768.44589: Calling all_plugins_play to load vars for managed_node1 22225 1726882768.44592: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882768.44595: Calling groups_plugins_play to load vars for managed_node1 22225 1726882768.45677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882768.47320: done with get_vars() 22225 1726882768.47340: done getting variables 22225 1726882768.47390: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:28 -0400 (0:00:00.045) 0:00:23.866 ****** 22225 1726882768.47414: entering _queue_task() for managed_node1/debug 22225 1726882768.47654: worker is 1 (out of 1 available) 22225 1726882768.47669: exiting _queue_task() for managed_node1/debug 22225 1726882768.47682: done queuing things up, now waiting for results queue to drain 22225 1726882768.47684: waiting for pending results... 
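The skip recorded just above is driven by the role's guard on network_state: the variable comes from the role defaults and is still {}, so the conditional network_state != {} evaluates to False and the task body never runs. A hedged sketch of how that task and its guard read (the module name and condition are taken from the log; the parameter wiring is assumed, since it is not observable for a skipped task):

- name: Configure networking state
  fedora.linux_system_roles.network_state:
    state: "{{ network_state }}"   # assumed parameter name; not captured here because the task was skipped
  when: network_state != {}        # condition reported as false_condition in the skip result above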
22225 1726882768.47864: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22225 1726882768.47962: in run() - task 0affc7ec-ae25-ec05-55b7-000000000029 22225 1726882768.47974: variable 'ansible_search_path' from source: unknown 22225 1726882768.47978: variable 'ansible_search_path' from source: unknown 22225 1726882768.48009: calling self._execute() 22225 1726882768.48086: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.48091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.48099: variable 'omit' from source: magic vars 22225 1726882768.48385: variable 'ansible_distribution_major_version' from source: facts 22225 1726882768.48394: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882768.48400: variable 'omit' from source: magic vars 22225 1726882768.48440: variable 'omit' from source: magic vars 22225 1726882768.48469: variable 'omit' from source: magic vars 22225 1726882768.48502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882768.48531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882768.48547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882768.48562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882768.48573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882768.48599: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882768.48603: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.48605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.48675: Set connection var ansible_connection to ssh 22225 1726882768.48689: Set connection var ansible_pipelining to False 22225 1726882768.48692: Set connection var ansible_shell_executable to /bin/sh 22225 1726882768.48695: Set connection var ansible_timeout to 10 22225 1726882768.48697: Set connection var ansible_shell_type to sh 22225 1726882768.48704: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882768.48725: variable 'ansible_shell_executable' from source: unknown 22225 1726882768.48728: variable 'ansible_connection' from source: unknown 22225 1726882768.48731: variable 'ansible_module_compression' from source: unknown 22225 1726882768.48733: variable 'ansible_shell_type' from source: unknown 22225 1726882768.48736: variable 'ansible_shell_executable' from source: unknown 22225 1726882768.48738: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.48741: variable 'ansible_pipelining' from source: unknown 22225 1726882768.48744: variable 'ansible_timeout' from source: unknown 22225 1726882768.48749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.48859: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 
1726882768.48868: variable 'omit' from source: magic vars 22225 1726882768.48874: starting attempt loop 22225 1726882768.48877: running the handler 22225 1726882768.48970: variable '__network_connections_result' from source: set_fact 22225 1726882768.49015: handler run complete 22225 1726882768.49032: attempt loop complete, returning result 22225 1726882768.49036: _execute() done 22225 1726882768.49038: dumping result to json 22225 1726882768.49041: done dumping result, returning 22225 1726882768.49050: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-ec05-55b7-000000000029] 22225 1726882768.49054: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000029 ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d (not-active)" ] } 22225 1726882768.49214: no more pending results, returning what we have 22225 1726882768.49217: results queue empty 22225 1726882768.49218: checking for any_errors_fatal 22225 1726882768.49225: done checking for any_errors_fatal 22225 1726882768.49226: checking for max_fail_percentage 22225 1726882768.49227: done checking for max_fail_percentage 22225 1726882768.49228: checking to see if all hosts have failed and the running result is not ok 22225 1726882768.49229: done checking to see if all hosts have failed 22225 1726882768.49230: getting the remaining hosts for this loop 22225 1726882768.49231: done getting the remaining hosts for this loop 22225 1726882768.49237: getting the next task for host managed_node1 22225 1726882768.49242: done getting next task for host managed_node1 22225 1726882768.49246: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22225 1726882768.49249: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882768.49258: getting variables 22225 1726882768.49260: in VariableManager get_vars() 22225 1726882768.49296: Calling all_inventory to load vars for managed_node1 22225 1726882768.49298: Calling groups_inventory to load vars for managed_node1 22225 1726882768.49300: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882768.49309: Calling all_plugins_play to load vars for managed_node1 22225 1726882768.49311: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882768.49314: Calling groups_plugins_play to load vars for managed_node1 22225 1726882768.49838: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000029 22225 1726882768.49842: WORKER PROCESS EXITING 22225 1726882768.50375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882768.51553: done with get_vars() 22225 1726882768.51571: done getting variables 22225 1726882768.51619: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:28 -0400 (0:00:00.042) 0:00:23.909 ****** 22225 1726882768.51644: entering _queue_task() for managed_node1/debug 22225 1726882768.51868: worker is 1 (out of 1 available) 22225 1726882768.51885: exiting _queue_task() for managed_node1/debug 22225 1726882768.51896: done queuing things up, now waiting for results queue to drain 22225 1726882768.51898: waiting for pending results... 
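The ok result above comes from a plain debug task that prints the stderr lines captured by the earlier network_connections run. A sketch of the task at roles/network/tasks/main.yml:177, reconstructed from the output (the exact wording in the role may differ):

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

The companion task at main.yml:181, queued next, dumps the whole __network_connections_result fact, which is why the full module_args and stderr appear again in the following result.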
22225 1726882768.52071: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22225 1726882768.52166: in run() - task 0affc7ec-ae25-ec05-55b7-00000000002a 22225 1726882768.52178: variable 'ansible_search_path' from source: unknown 22225 1726882768.52185: variable 'ansible_search_path' from source: unknown 22225 1726882768.52212: calling self._execute() 22225 1726882768.52286: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.52292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.52299: variable 'omit' from source: magic vars 22225 1726882768.52575: variable 'ansible_distribution_major_version' from source: facts 22225 1726882768.52586: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882768.52593: variable 'omit' from source: magic vars 22225 1726882768.52634: variable 'omit' from source: magic vars 22225 1726882768.52661: variable 'omit' from source: magic vars 22225 1726882768.52694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882768.52724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882768.52738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882768.52753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882768.52763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882768.52790: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882768.52793: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.52795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.52863: Set connection var ansible_connection to ssh 22225 1726882768.52872: Set connection var ansible_pipelining to False 22225 1726882768.52883: Set connection var ansible_shell_executable to /bin/sh 22225 1726882768.52886: Set connection var ansible_timeout to 10 22225 1726882768.52888: Set connection var ansible_shell_type to sh 22225 1726882768.52900: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882768.52916: variable 'ansible_shell_executable' from source: unknown 22225 1726882768.52919: variable 'ansible_connection' from source: unknown 22225 1726882768.52924: variable 'ansible_module_compression' from source: unknown 22225 1726882768.52926: variable 'ansible_shell_type' from source: unknown 22225 1726882768.52929: variable 'ansible_shell_executable' from source: unknown 22225 1726882768.52931: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.52936: variable 'ansible_pipelining' from source: unknown 22225 1726882768.52938: variable 'ansible_timeout' from source: unknown 22225 1726882768.52943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.53048: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 
1726882768.53058: variable 'omit' from source: magic vars 22225 1726882768.53064: starting attempt loop 22225 1726882768.53067: running the handler 22225 1726882768.53104: variable '__network_connections_result' from source: set_fact 22225 1726882768.53162: variable '__network_connections_result' from source: set_fact 22225 1726882768.53257: handler run complete 22225 1726882768.53277: attempt loop complete, returning result 22225 1726882768.53282: _execute() done 22225 1726882768.53286: dumping result to json 22225 1726882768.53288: done dumping result, returning 22225 1726882768.53295: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-ec05-55b7-00000000002a] 22225 1726882768.53299: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000002a 22225 1726882768.53398: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000002a 22225 1726882768.53401: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, b7f0538a-9cdb-4097-80b7-66d6eec65d0d (not-active)" ] } } 22225 1726882768.53496: no more pending results, returning what we have 22225 1726882768.53499: results queue empty 22225 1726882768.53500: checking for any_errors_fatal 22225 1726882768.53505: done checking for any_errors_fatal 22225 1726882768.53505: checking for max_fail_percentage 22225 1726882768.53507: done checking for max_fail_percentage 22225 1726882768.53508: checking to see if all hosts have failed and the running result is not ok 22225 1726882768.53509: done checking to see if all hosts have failed 22225 1726882768.53509: getting the remaining hosts for this loop 22225 1726882768.53512: done getting the remaining hosts for this loop 22225 1726882768.53515: getting the next task for host managed_node1 22225 1726882768.53520: done getting next task for host managed_node1 22225 1726882768.53531: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22225 1726882768.53534: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882768.53543: getting variables 22225 1726882768.53545: in VariableManager get_vars() 22225 1726882768.53574: Calling all_inventory to load vars for managed_node1 22225 1726882768.53581: Calling groups_inventory to load vars for managed_node1 22225 1726882768.53583: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882768.53590: Calling all_plugins_play to load vars for managed_node1 22225 1726882768.53592: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882768.53594: Calling groups_plugins_play to load vars for managed_node1 22225 1726882768.54518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882768.55695: done with get_vars() 22225 1726882768.55713: done getting variables 22225 1726882768.55755: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:28 -0400 (0:00:00.041) 0:00:23.950 ****** 22225 1726882768.55777: entering _queue_task() for managed_node1/debug 22225 1726882768.55992: worker is 1 (out of 1 available) 22225 1726882768.56008: exiting _queue_task() for managed_node1/debug 22225 1726882768.56019: done queuing things up, now waiting for results queue to drain 22225 1726882768.56020: waiting for pending results... 22225 1726882768.56190: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22225 1726882768.56277: in run() - task 0affc7ec-ae25-ec05-55b7-00000000002b 22225 1726882768.56289: variable 'ansible_search_path' from source: unknown 22225 1726882768.56292: variable 'ansible_search_path' from source: unknown 22225 1726882768.56320: calling self._execute() 22225 1726882768.56402: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.56406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.56416: variable 'omit' from source: magic vars 22225 1726882768.56701: variable 'ansible_distribution_major_version' from source: facts 22225 1726882768.56711: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882768.56797: variable 'network_state' from source: role '' defaults 22225 1726882768.56807: Evaluated conditional (network_state != {}): False 22225 1726882768.56810: when evaluation is False, skipping this task 22225 1726882768.56813: _execute() done 22225 1726882768.56815: dumping result to json 22225 1726882768.56818: done dumping result, returning 22225 1726882768.56828: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-ec05-55b7-00000000002b] 22225 1726882768.56833: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000002b 22225 1726882768.56925: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000002b 22225 1726882768.56929: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 22225 1726882768.56975: no more pending results, returning what we 
have 22225 1726882768.56978: results queue empty 22225 1726882768.56979: checking for any_errors_fatal 22225 1726882768.56985: done checking for any_errors_fatal 22225 1726882768.56986: checking for max_fail_percentage 22225 1726882768.56987: done checking for max_fail_percentage 22225 1726882768.56988: checking to see if all hosts have failed and the running result is not ok 22225 1726882768.56989: done checking to see if all hosts have failed 22225 1726882768.56990: getting the remaining hosts for this loop 22225 1726882768.56991: done getting the remaining hosts for this loop 22225 1726882768.56994: getting the next task for host managed_node1 22225 1726882768.56999: done getting next task for host managed_node1 22225 1726882768.57004: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22225 1726882768.57007: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882768.57020: getting variables 22225 1726882768.57023: in VariableManager get_vars() 22225 1726882768.57062: Calling all_inventory to load vars for managed_node1 22225 1726882768.57064: Calling groups_inventory to load vars for managed_node1 22225 1726882768.57066: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882768.57075: Calling all_plugins_play to load vars for managed_node1 22225 1726882768.57077: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882768.57080: Calling groups_plugins_play to load vars for managed_node1 22225 1726882768.58084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882768.59246: done with get_vars() 22225 1726882768.59263: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:28 -0400 (0:00:00.035) 0:00:23.985 ****** 22225 1726882768.59332: entering _queue_task() for managed_node1/ping 22225 1726882768.59333: Creating lock for ping 22225 1726882768.59550: worker is 1 (out of 1 available) 22225 1726882768.59566: exiting _queue_task() for managed_node1/ping 22225 1726882768.59578: done queuing things up, now waiting for results queue to drain 22225 1726882768.59579: waiting for pending results... 
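After the network_state tasks are skipped, the role re-validates reachability with a bare ping module call over the same persistent SSH connection (the ControlMaster socket at /root/.ansible/cp/d9f6ac3c31 seen throughout the stderr chunks). A sketch of the task at roles/network/tasks/main.yml:192, assuming only the task name and action visible in the log:

- name: Re-test connectivity
  ping: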
22225 1726882768.59759: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 22225 1726882768.59855: in run() - task 0affc7ec-ae25-ec05-55b7-00000000002c 22225 1726882768.59867: variable 'ansible_search_path' from source: unknown 22225 1726882768.59870: variable 'ansible_search_path' from source: unknown 22225 1726882768.59901: calling self._execute() 22225 1726882768.59973: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.59977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.59989: variable 'omit' from source: magic vars 22225 1726882768.60274: variable 'ansible_distribution_major_version' from source: facts 22225 1726882768.60286: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882768.60292: variable 'omit' from source: magic vars 22225 1726882768.60333: variable 'omit' from source: magic vars 22225 1726882768.60360: variable 'omit' from source: magic vars 22225 1726882768.60394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882768.60425: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882768.60441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882768.60459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882768.60468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882768.60494: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882768.60498: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.60501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.60571: Set connection var ansible_connection to ssh 22225 1726882768.60581: Set connection var ansible_pipelining to False 22225 1726882768.60590: Set connection var ansible_shell_executable to /bin/sh 22225 1726882768.60596: Set connection var ansible_timeout to 10 22225 1726882768.60598: Set connection var ansible_shell_type to sh 22225 1726882768.60604: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882768.60623: variable 'ansible_shell_executable' from source: unknown 22225 1726882768.60626: variable 'ansible_connection' from source: unknown 22225 1726882768.60630: variable 'ansible_module_compression' from source: unknown 22225 1726882768.60632: variable 'ansible_shell_type' from source: unknown 22225 1726882768.60634: variable 'ansible_shell_executable' from source: unknown 22225 1726882768.60637: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882768.60642: variable 'ansible_pipelining' from source: unknown 22225 1726882768.60644: variable 'ansible_timeout' from source: unknown 22225 1726882768.60648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882768.60812: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882768.60820: variable 'omit' from source: magic vars 22225 
1726882768.60827: starting attempt loop 22225 1726882768.60830: running the handler 22225 1726882768.60843: _low_level_execute_command(): starting 22225 1726882768.60851: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882768.61387: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.61391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.61394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.61396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.61454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882768.61458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882768.61462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882768.61521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882768.63270: stdout chunk (state=3): >>>/root <<< 22225 1726882768.63381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882768.63434: stderr chunk (state=3): >>><<< 22225 1726882768.63437: stdout chunk (state=3): >>><<< 22225 1726882768.63455: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882768.63466: _low_level_execute_command(): starting 22225 1726882768.63471: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571 `" && echo ansible-tmp-1726882768.6345384-23068-60715761283571="` echo /root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571 `" ) && sleep 0' 22225 1726882768.63923: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.63927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882768.63929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882768.63937: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.63940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.63983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882768.63987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882768.64057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882768.66003: stdout chunk (state=3): >>>ansible-tmp-1726882768.6345384-23068-60715761283571=/root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571 <<< 22225 1726882768.66129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882768.66170: stderr chunk (state=3): >>><<< 22225 1726882768.66173: stdout chunk (state=3): >>><<< 22225 1726882768.66187: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882768.6345384-23068-60715761283571=/root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 
1726882768.66225: variable 'ansible_module_compression' from source: unknown 22225 1726882768.66258: ANSIBALLZ: Using lock for ping 22225 1726882768.66261: ANSIBALLZ: Acquiring lock 22225 1726882768.66264: ANSIBALLZ: Lock acquired: 140272891387376 22225 1726882768.66266: ANSIBALLZ: Creating module 22225 1726882768.74364: ANSIBALLZ: Writing module into payload 22225 1726882768.74410: ANSIBALLZ: Writing module 22225 1726882768.74428: ANSIBALLZ: Renaming module 22225 1726882768.74434: ANSIBALLZ: Done creating module 22225 1726882768.74448: variable 'ansible_facts' from source: unknown 22225 1726882768.74496: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/AnsiballZ_ping.py 22225 1726882768.74596: Sending initial data 22225 1726882768.74600: Sent initial data (152 bytes) 22225 1726882768.75073: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.75076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.75079: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.75081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.75139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882768.75142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882768.75145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882768.75203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882768.76842: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 22225 1726882768.76846: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882768.76885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882768.76941: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp3z58oxmi /root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/AnsiballZ_ping.py <<< 22225 1726882768.76944: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/AnsiballZ_ping.py" <<< 22225 1726882768.76990: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp3z58oxmi" to remote "/root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/AnsiballZ_ping.py" <<< 22225 1726882768.76992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/AnsiballZ_ping.py" <<< 22225 1726882768.77554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882768.77617: stderr chunk (state=3): >>><<< 22225 1726882768.77620: stdout chunk (state=3): >>><<< 22225 1726882768.77639: done transferring module to remote 22225 1726882768.77649: _low_level_execute_command(): starting 22225 1726882768.77653: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/ /root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/AnsiballZ_ping.py && sleep 0' 22225 1726882768.78089: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882768.78092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882768.78095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882768.78100: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.78102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.78152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882768.78158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882768.78205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882768.80007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882768.80051: stderr chunk (state=3): >>><<< 22225 1726882768.80054: stdout chunk (state=3): >>><<< 22225 1726882768.80068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882768.80071: _low_level_execute_command(): starting 22225 1726882768.80075: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/AnsiballZ_ping.py && sleep 0' 22225 1726882768.80506: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.80509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.80512: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882768.80514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882768.80516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.80561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882768.80565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882768.80629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882768.96787: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22225 1726882768.98026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882768.98088: stderr chunk (state=3): >>><<< 22225 1726882768.98092: stdout chunk (state=3): >>><<< 22225 1726882768.98106: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882768.98128: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882768.98137: _low_level_execute_command(): starting 22225 1726882768.98143: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882768.6345384-23068-60715761283571/ > /dev/null 2>&1 && sleep 0' 22225 1726882768.98616: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882768.98620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.98625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882768.98627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882768.98678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882768.98692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882768.98694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882768.98738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882769.00621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882769.00669: stderr chunk (state=3): >>><<< 22225 1726882769.00675: stdout chunk (state=3): >>><<< 22225 1726882769.00689: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882769.00698: handler run complete 22225 1726882769.00710: attempt loop complete, returning result 22225 1726882769.00713: _execute() done 22225 1726882769.00715: dumping result to json 22225 1726882769.00718: done dumping result, returning 22225 1726882769.00730: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-ec05-55b7-00000000002c] 22225 1726882769.00735: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000002c 22225 1726882769.00831: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000002c 22225 1726882769.00836: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 22225 1726882769.00900: no more pending results, returning what we have 22225 1726882769.00903: results queue empty 22225 1726882769.00904: checking for any_errors_fatal 22225 1726882769.00911: done checking for any_errors_fatal 22225 1726882769.00911: checking for max_fail_percentage 22225 1726882769.00913: done checking for max_fail_percentage 22225 1726882769.00914: checking to see if all hosts have failed and the running result is not ok 22225 1726882769.00915: done checking to see if all hosts have failed 22225 1726882769.00916: getting the remaining hosts for this loop 22225 1726882769.00918: done getting the remaining hosts for this loop 22225 1726882769.00923: getting the next task for host managed_node1 22225 1726882769.00932: done getting next task for host managed_node1 22225 1726882769.00935: ^ task is: TASK: meta (role_complete) 22225 1726882769.00938: ^ state is: HOST STATE: 
block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882769.00949: getting variables 22225 1726882769.00951: in VariableManager get_vars() 22225 1726882769.00997: Calling all_inventory to load vars for managed_node1 22225 1726882769.01000: Calling groups_inventory to load vars for managed_node1 22225 1726882769.01002: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.01013: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.01015: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.01018: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.02145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882769.03331: done with get_vars() 22225 1726882769.03350: done getting variables 22225 1726882769.03415: done queuing things up, now waiting for results queue to drain 22225 1726882769.03417: results queue empty 22225 1726882769.03417: checking for any_errors_fatal 22225 1726882769.03419: done checking for any_errors_fatal 22225 1726882769.03420: checking for max_fail_percentage 22225 1726882769.03420: done checking for max_fail_percentage 22225 1726882769.03421: checking to see if all hosts have failed and the running result is not ok 22225 1726882769.03423: done checking to see if all hosts have failed 22225 1726882769.03424: getting the remaining hosts for this loop 22225 1726882769.03425: done getting the remaining hosts for this loop 22225 1726882769.03427: getting the next task for host managed_node1 22225 1726882769.03430: done getting next task for host managed_node1 22225 1726882769.03432: ^ task is: TASK: Include the task 'assert_device_present.yml' 22225 1726882769.03433: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882769.03434: getting variables 22225 1726882769.03435: in VariableManager get_vars() 22225 1726882769.03445: Calling all_inventory to load vars for managed_node1 22225 1726882769.03446: Calling groups_inventory to load vars for managed_node1 22225 1726882769.03448: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.03451: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.03453: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.03455: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.04287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882769.05467: done with get_vars() 22225 1726882769.05490: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:47 Friday 20 September 2024 21:39:29 -0400 (0:00:00.462) 0:00:24.448 ****** 22225 1726882769.05546: entering _queue_task() for managed_node1/include_tasks 22225 1726882769.05818: worker is 1 (out of 1 available) 22225 1726882769.05837: exiting _queue_task() for managed_node1/include_tasks 22225 1726882769.05848: done queuing things up, now waiting for results queue to drain 22225 1726882769.05849: waiting for pending results... 22225 1726882769.06031: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' 22225 1726882769.06105: in run() - task 0affc7ec-ae25-ec05-55b7-00000000005c 22225 1726882769.06118: variable 'ansible_search_path' from source: unknown 22225 1726882769.06150: calling self._execute() 22225 1726882769.06229: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.06234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882769.06243: variable 'omit' from source: magic vars 22225 1726882769.06538: variable 'ansible_distribution_major_version' from source: facts 22225 1726882769.06548: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882769.06552: _execute() done 22225 1726882769.06556: dumping result to json 22225 1726882769.06559: done dumping result, returning 22225 1726882769.06566: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_present.yml' [0affc7ec-ae25-ec05-55b7-00000000005c] 22225 1726882769.06571: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000005c 22225 1726882769.06670: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000005c 22225 1726882769.06673: WORKER PROCESS EXITING 22225 1726882769.06704: no more pending results, returning what we have 22225 1726882769.06709: in VariableManager get_vars() 22225 1726882769.06757: Calling all_inventory to load vars for managed_node1 22225 1726882769.06760: Calling groups_inventory to load vars for managed_node1 22225 1726882769.06762: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.06776: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.06778: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.06786: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.11250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882769.13097: done with get_vars() 22225 
1726882769.13121: variable 'ansible_search_path' from source: unknown 22225 1726882769.13137: we have included files to process 22225 1726882769.13138: generating all_blocks data 22225 1726882769.13140: done generating all_blocks data 22225 1726882769.13143: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22225 1726882769.13144: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22225 1726882769.13147: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22225 1726882769.13303: in VariableManager get_vars() 22225 1726882769.13329: done with get_vars() 22225 1726882769.13439: done processing included file 22225 1726882769.13441: iterating over new_blocks loaded from include file 22225 1726882769.13443: in VariableManager get_vars() 22225 1726882769.13461: done with get_vars() 22225 1726882769.13462: filtering new block on tags 22225 1726882769.13481: done filtering new block on tags 22225 1726882769.13484: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 22225 1726882769.13489: extending task lists for all hosts with included blocks 22225 1726882769.15754: done extending task lists 22225 1726882769.15756: done processing included files 22225 1726882769.15757: results queue empty 22225 1726882769.15758: checking for any_errors_fatal 22225 1726882769.15759: done checking for any_errors_fatal 22225 1726882769.15760: checking for max_fail_percentage 22225 1726882769.15761: done checking for max_fail_percentage 22225 1726882769.15762: checking to see if all hosts have failed and the running result is not ok 22225 1726882769.15763: done checking to see if all hosts have failed 22225 1726882769.15764: getting the remaining hosts for this loop 22225 1726882769.15765: done getting the remaining hosts for this loop 22225 1726882769.15768: getting the next task for host managed_node1 22225 1726882769.15772: done getting next task for host managed_node1 22225 1726882769.15774: ^ task is: TASK: Include the task 'get_interface_stat.yml' 22225 1726882769.15777: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882769.15782: getting variables 22225 1726882769.15783: in VariableManager get_vars() 22225 1726882769.15798: Calling all_inventory to load vars for managed_node1 22225 1726882769.15800: Calling groups_inventory to load vars for managed_node1 22225 1726882769.15803: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.15809: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.15811: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.15815: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.16760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882769.17950: done with get_vars() 22225 1726882769.17972: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:39:29 -0400 (0:00:00.125) 0:00:24.573 ****** 22225 1726882769.18050: entering _queue_task() for managed_node1/include_tasks 22225 1726882769.18418: worker is 1 (out of 1 available) 22225 1726882769.18435: exiting _queue_task() for managed_node1/include_tasks 22225 1726882769.18448: done queuing things up, now waiting for results queue to drain 22225 1726882769.18449: waiting for pending results... 22225 1726882769.18849: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 22225 1726882769.18859: in run() - task 0affc7ec-ae25-ec05-55b7-0000000002b5 22225 1726882769.18863: variable 'ansible_search_path' from source: unknown 22225 1726882769.18867: variable 'ansible_search_path' from source: unknown 22225 1726882769.18903: calling self._execute() 22225 1726882769.19020: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.19037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882769.19068: variable 'omit' from source: magic vars 22225 1726882769.19484: variable 'ansible_distribution_major_version' from source: facts 22225 1726882769.19614: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882769.19619: _execute() done 22225 1726882769.19626: dumping result to json 22225 1726882769.19629: done dumping result, returning 22225 1726882769.19632: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affc7ec-ae25-ec05-55b7-0000000002b5] 22225 1726882769.19635: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000002b5 22225 1726882769.19833: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000002b5 22225 1726882769.19838: WORKER PROCESS EXITING 22225 1726882769.19865: no more pending results, returning what we have 22225 1726882769.19870: in VariableManager get_vars() 22225 1726882769.19911: Calling all_inventory to load vars for managed_node1 22225 1726882769.19914: Calling groups_inventory to load vars for managed_node1 22225 1726882769.19916: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.19929: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.19932: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.19935: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.21534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 22225 1726882769.23713: done with get_vars() 22225 1726882769.23735: variable 'ansible_search_path' from source: unknown 22225 1726882769.23737: variable 'ansible_search_path' from source: unknown 22225 1726882769.23774: we have included files to process 22225 1726882769.23776: generating all_blocks data 22225 1726882769.23777: done generating all_blocks data 22225 1726882769.23779: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22225 1726882769.23782: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22225 1726882769.23785: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22225 1726882769.24021: done processing included file 22225 1726882769.24026: iterating over new_blocks loaded from include file 22225 1726882769.24027: in VariableManager get_vars() 22225 1726882769.24047: done with get_vars() 22225 1726882769.24048: filtering new block on tags 22225 1726882769.24063: done filtering new block on tags 22225 1726882769.24066: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 22225 1726882769.24070: extending task lists for all hosts with included blocks 22225 1726882769.24185: done extending task lists 22225 1726882769.24187: done processing included files 22225 1726882769.24187: results queue empty 22225 1726882769.24188: checking for any_errors_fatal 22225 1726882769.24192: done checking for any_errors_fatal 22225 1726882769.24193: checking for max_fail_percentage 22225 1726882769.24194: done checking for max_fail_percentage 22225 1726882769.24195: checking to see if all hosts have failed and the running result is not ok 22225 1726882769.24195: done checking to see if all hosts have failed 22225 1726882769.24196: getting the remaining hosts for this loop 22225 1726882769.24198: done getting the remaining hosts for this loop 22225 1726882769.24200: getting the next task for host managed_node1 22225 1726882769.24204: done getting next task for host managed_node1 22225 1726882769.24206: ^ task is: TASK: Get stat for interface {{ interface }} 22225 1726882769.24210: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882769.24212: getting variables 22225 1726882769.24213: in VariableManager get_vars() 22225 1726882769.24229: Calling all_inventory to load vars for managed_node1 22225 1726882769.24231: Calling groups_inventory to load vars for managed_node1 22225 1726882769.24234: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.24240: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.24243: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.24246: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.25817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882769.28016: done with get_vars() 22225 1726882769.28044: done getting variables 22225 1726882769.28213: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:39:29 -0400 (0:00:00.101) 0:00:24.675 ****** 22225 1726882769.28245: entering _queue_task() for managed_node1/stat 22225 1726882769.28601: worker is 1 (out of 1 available) 22225 1726882769.28616: exiting _queue_task() for managed_node1/stat 22225 1726882769.28832: done queuing things up, now waiting for results queue to drain 22225 1726882769.28835: waiting for pending results... 22225 1726882769.28941: running TaskExecutor() for managed_node1/TASK: Get stat for interface veth0 22225 1726882769.29172: in run() - task 0affc7ec-ae25-ec05-55b7-0000000003a0 22225 1726882769.29176: variable 'ansible_search_path' from source: unknown 22225 1726882769.29179: variable 'ansible_search_path' from source: unknown 22225 1726882769.29184: calling self._execute() 22225 1726882769.29271: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.29293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882769.29308: variable 'omit' from source: magic vars 22225 1726882769.29698: variable 'ansible_distribution_major_version' from source: facts 22225 1726882769.29721: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882769.29734: variable 'omit' from source: magic vars 22225 1726882769.29783: variable 'omit' from source: magic vars 22225 1726882769.29886: variable 'interface' from source: play vars 22225 1726882769.29929: variable 'omit' from source: magic vars 22225 1726882769.29962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882769.30040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882769.30043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882769.30057: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882769.30074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882769.30115: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882769.30128: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.30148: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 22225 1726882769.30257: Set connection var ansible_connection to ssh 22225 1726882769.30269: Set connection var ansible_pipelining to False 22225 1726882769.30366: Set connection var ansible_shell_executable to /bin/sh 22225 1726882769.30369: Set connection var ansible_timeout to 10 22225 1726882769.30372: Set connection var ansible_shell_type to sh 22225 1726882769.30374: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882769.30377: variable 'ansible_shell_executable' from source: unknown 22225 1726882769.30382: variable 'ansible_connection' from source: unknown 22225 1726882769.30385: variable 'ansible_module_compression' from source: unknown 22225 1726882769.30387: variable 'ansible_shell_type' from source: unknown 22225 1726882769.30389: variable 'ansible_shell_executable' from source: unknown 22225 1726882769.30392: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.30394: variable 'ansible_pipelining' from source: unknown 22225 1726882769.30396: variable 'ansible_timeout' from source: unknown 22225 1726882769.30398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882769.30620: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882769.30643: variable 'omit' from source: magic vars 22225 1726882769.30657: starting attempt loop 22225 1726882769.30665: running the handler 22225 1726882769.30690: _low_level_execute_command(): starting 22225 1726882769.30707: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882769.31552: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882769.31614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882769.31637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882769.31662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882769.31767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882769.33546: stdout chunk (state=3): >>>/root <<< 22225 1726882769.33755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882769.33759: stdout chunk (state=3): >>><<< 22225 1726882769.33761: stderr chunk (state=3): >>><<< 22225 1726882769.33886: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882769.33891: _low_level_execute_command(): starting 22225 1726882769.33896: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815 `" && echo ansible-tmp-1726882769.3378456-23079-280494831285815="` echo /root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815 `" ) && sleep 0' 22225 1726882769.34480: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882769.34495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882769.34508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882769.34581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882769.34585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882769.34655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882769.34698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882769.34701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882769.34782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882769.36738: stdout chunk (state=3): >>>ansible-tmp-1726882769.3378456-23079-280494831285815=/root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815 <<< 22225 1726882769.37029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882769.37032: stdout chunk (state=3): >>><<< 22225 
1726882769.37035: stderr chunk (state=3): >>><<< 22225 1726882769.37038: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882769.3378456-23079-280494831285815=/root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882769.37041: variable 'ansible_module_compression' from source: unknown 22225 1726882769.37083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22225 1726882769.37137: variable 'ansible_facts' from source: unknown 22225 1726882769.37228: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/AnsiballZ_stat.py 22225 1726882769.37490: Sending initial data 22225 1726882769.37494: Sent initial data (153 bytes) 22225 1726882769.38046: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882769.38142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882769.38157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882769.38170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882769.38179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882769.38265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882769.39847: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882769.39903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882769.39964: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpeyzmco2v /root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/AnsiballZ_stat.py <<< 22225 1726882769.39968: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/AnsiballZ_stat.py" <<< 22225 1726882769.40009: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpeyzmco2v" to remote "/root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/AnsiballZ_stat.py" <<< 22225 1726882769.40856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882769.40889: stderr chunk (state=3): >>><<< 22225 1726882769.40898: stdout chunk (state=3): >>><<< 22225 1726882769.40996: done transferring module to remote 22225 1726882769.40999: _low_level_execute_command(): starting 22225 1726882769.41002: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/ /root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/AnsiballZ_stat.py && sleep 0' 22225 1726882769.41677: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882769.41690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882769.41706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882769.41735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882769.41874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882769.41960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882769.41998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 
1726882769.42016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882769.42043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882769.42226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882769.44267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882769.44297: stderr chunk (state=3): >>><<< 22225 1726882769.44300: stdout chunk (state=3): >>><<< 22225 1726882769.44409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882769.44413: _low_level_execute_command(): starting 22225 1726882769.44416: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/AnsiballZ_stat.py && sleep 0' 22225 1726882769.45682: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882769.45685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882769.45688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882769.45690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882769.45693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882769.46186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882769.46282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882769.63032: stdout chunk 
(state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38428, "dev": 23, "nlink": 1, "atime": 1726882756.5130334, "mtime": 1726882756.5130334, "ctime": 1726882756.5130334, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22225 1726882769.64529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882769.64606: stderr chunk (state=3): >>><<< 22225 1726882769.64617: stdout chunk (state=3): >>><<< 22225 1726882769.64645: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 38428, "dev": 23, "nlink": 1, "atime": 1726882756.5130334, "mtime": 1726882756.5130334, "ctime": 1726882756.5130334, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
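For context, the stat invocation above is the 'Get stat for interface veth0' task loaded from get_interface_stat.yml (task path logged earlier). A minimal sketch of what that task likely looks like, reconstructed only from the module_args visible in this log (path /sys/class/net/veth0 with attribute, checksum, and MIME collection disabled, follow off); how the result is later exposed as interface_stat (register vs. set_fact) is an assumption, since the log only reports the variable coming from set_fact:

    # Sketch based on the module_args shown in the log output above; not the verbatim tasks file.
    - name: Get stat for interface {{ interface }}
      stat:
        get_attributes: false
        get_checksum: false
        get_mime: false
        follow: false
        path: "/sys/class/net/{{ interface }}"
      register: interface_stat   # assumption: the log only shows 'interface_stat' arriving via set_fact

With interface set to veth0 in the play vars, the path resolves to /sys/class/net/veth0, matching the symlink details in the module result above.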
22225 1726882769.64713: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882769.64734: _low_level_execute_command(): starting 22225 1726882769.64744: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882769.3378456-23079-280494831285815/ > /dev/null 2>&1 && sleep 0' 22225 1726882769.65579: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882769.65941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882769.65970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882769.65989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882769.66069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882769.68081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882769.68085: stdout chunk (state=3): >>><<< 22225 1726882769.68087: stderr chunk (state=3): >>><<< 22225 1726882769.68104: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882769.68115: handler run complete 22225 1726882769.68180: attempt loop complete, returning result 22225 1726882769.68200: _execute() done 22225 1726882769.68208: dumping result to json 22225 1726882769.68230: done dumping result, returning 22225 1726882769.68248: done running TaskExecutor() for managed_node1/TASK: Get stat for interface veth0 [0affc7ec-ae25-ec05-55b7-0000000003a0] 22225 1726882769.68258: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003a0 ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882756.5130334, "block_size": 4096, "blocks": 0, "ctime": 1726882756.5130334, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 38428, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882756.5130334, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 22225 1726882769.68803: no more pending results, returning what we have 22225 1726882769.68808: results queue empty 22225 1726882769.68809: checking for any_errors_fatal 22225 1726882769.68812: done checking for any_errors_fatal 22225 1726882769.68812: checking for max_fail_percentage 22225 1726882769.68814: done checking for max_fail_percentage 22225 1726882769.68815: checking to see if all hosts have failed and the running result is not ok 22225 1726882769.68816: done checking to see if all hosts have failed 22225 1726882769.68817: getting the remaining hosts for this loop 22225 1726882769.68819: done getting the remaining hosts for this loop 22225 1726882769.68826: getting the next task for host managed_node1 22225 1726882769.68837: done getting next task for host managed_node1 22225 1726882769.68840: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 22225 1726882769.68844: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882769.68850: getting variables 22225 1726882769.68852: in VariableManager get_vars() 22225 1726882769.68897: Calling all_inventory to load vars for managed_node1 22225 1726882769.68901: Calling groups_inventory to load vars for managed_node1 22225 1726882769.68903: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.69349: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003a0 22225 1726882769.69353: WORKER PROCESS EXITING 22225 1726882769.69366: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.69369: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.69373: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.71305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882769.73587: done with get_vars() 22225 1726882769.73617: done getting variables 22225 1726882769.74017: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 22225 1726882769.74248: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:39:29 -0400 (0:00:00.460) 0:00:25.135 ****** 22225 1726882769.74283: entering _queue_task() for managed_node1/assert 22225 1726882769.74285: Creating lock for assert 22225 1726882769.74958: worker is 1 (out of 1 available) 22225 1726882769.74971: exiting _queue_task() for managed_node1/assert 22225 1726882769.74984: done queuing things up, now waiting for results queue to drain 22225 1726882769.74986: waiting for pending results... 
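The assert being queued here is the second task in assert_device_present.yml (task path :5, shown just below), following the get_interface_stat.yml include at :3. A rough sketch of that tasks file, assuming only what the task names and the evaluated conditional (interface_stat.stat.exists) in this log imply; any failure message or extra options are hypothetical:

    # Sketch of assert_device_present.yml inferred from the task names and conditional in this log.
    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml

    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists

The entries below confirm the conditional evaluated to True for veth0, which is why the task reports "All assertions passed".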
22225 1726882769.75221: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'veth0' 22225 1726882769.75638: in run() - task 0affc7ec-ae25-ec05-55b7-0000000002b6 22225 1726882769.75642: variable 'ansible_search_path' from source: unknown 22225 1726882769.75644: variable 'ansible_search_path' from source: unknown 22225 1726882769.75647: calling self._execute() 22225 1726882769.75913: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.76022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882769.76029: variable 'omit' from source: magic vars 22225 1726882769.76742: variable 'ansible_distribution_major_version' from source: facts 22225 1726882769.76765: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882769.76805: variable 'omit' from source: magic vars 22225 1726882769.76857: variable 'omit' from source: magic vars 22225 1726882769.76977: variable 'interface' from source: play vars 22225 1726882769.77008: variable 'omit' from source: magic vars 22225 1726882769.77060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882769.77110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882769.77140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882769.77165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882769.77185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882769.77231: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882769.77243: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.77251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882769.77368: Set connection var ansible_connection to ssh 22225 1726882769.77388: Set connection var ansible_pipelining to False 22225 1726882769.77404: Set connection var ansible_shell_executable to /bin/sh 22225 1726882769.77415: Set connection var ansible_timeout to 10 22225 1726882769.77424: Set connection var ansible_shell_type to sh 22225 1726882769.77441: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882769.77475: variable 'ansible_shell_executable' from source: unknown 22225 1726882769.77488: variable 'ansible_connection' from source: unknown 22225 1726882769.77496: variable 'ansible_module_compression' from source: unknown 22225 1726882769.77503: variable 'ansible_shell_type' from source: unknown 22225 1726882769.77510: variable 'ansible_shell_executable' from source: unknown 22225 1726882769.77517: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.77528: variable 'ansible_pipelining' from source: unknown 22225 1726882769.77534: variable 'ansible_timeout' from source: unknown 22225 1726882769.77544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882769.77713: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 22225 1726882769.77734: variable 'omit' from source: magic vars 22225 1726882769.77768: starting attempt loop 22225 1726882769.77771: running the handler 22225 1726882769.77909: variable 'interface_stat' from source: set_fact 22225 1726882769.77987: Evaluated conditional (interface_stat.stat.exists): True 22225 1726882769.77990: handler run complete 22225 1726882769.77993: attempt loop complete, returning result 22225 1726882769.77995: _execute() done 22225 1726882769.77997: dumping result to json 22225 1726882769.78000: done dumping result, returning 22225 1726882769.78001: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'veth0' [0affc7ec-ae25-ec05-55b7-0000000002b6] 22225 1726882769.78004: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000002b6 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 22225 1726882769.78312: no more pending results, returning what we have 22225 1726882769.78316: results queue empty 22225 1726882769.78318: checking for any_errors_fatal 22225 1726882769.78331: done checking for any_errors_fatal 22225 1726882769.78332: checking for max_fail_percentage 22225 1726882769.78334: done checking for max_fail_percentage 22225 1726882769.78335: checking to see if all hosts have failed and the running result is not ok 22225 1726882769.78337: done checking to see if all hosts have failed 22225 1726882769.78337: getting the remaining hosts for this loop 22225 1726882769.78339: done getting the remaining hosts for this loop 22225 1726882769.78344: getting the next task for host managed_node1 22225 1726882769.78355: done getting next task for host managed_node1 22225 1726882769.78358: ^ task is: TASK: Include the task 'assert_profile_present.yml' 22225 1726882769.78361: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882769.78365: getting variables 22225 1726882769.78367: in VariableManager get_vars() 22225 1726882769.78418: Calling all_inventory to load vars for managed_node1 22225 1726882769.78424: Calling groups_inventory to load vars for managed_node1 22225 1726882769.78427: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.78434: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000002b6 22225 1726882769.78437: WORKER PROCESS EXITING 22225 1726882769.78449: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.78452: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.78455: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.81246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882769.83540: done with get_vars() 22225 1726882769.83566: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:49 Friday 20 September 2024 21:39:29 -0400 (0:00:00.093) 0:00:25.229 ****** 22225 1726882769.83669: entering _queue_task() for managed_node1/include_tasks 22225 1726882769.84037: worker is 1 (out of 1 available) 22225 1726882769.84053: exiting _queue_task() for managed_node1/include_tasks 22225 1726882769.84068: done queuing things up, now waiting for results queue to drain 22225 1726882769.84069: waiting for pending results... 22225 1726882769.84324: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 22225 1726882769.84428: in run() - task 0affc7ec-ae25-ec05-55b7-00000000005d 22225 1726882769.84450: variable 'ansible_search_path' from source: unknown 22225 1726882769.84498: calling self._execute() 22225 1726882769.84611: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.84625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882769.84641: variable 'omit' from source: magic vars 22225 1726882769.85053: variable 'ansible_distribution_major_version' from source: facts 22225 1726882769.85070: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882769.85080: _execute() done 22225 1726882769.85088: dumping result to json 22225 1726882769.85096: done dumping result, returning 22225 1726882769.85140: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [0affc7ec-ae25-ec05-55b7-00000000005d] 22225 1726882769.85143: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000005d 22225 1726882769.85265: no more pending results, returning what we have 22225 1726882769.85271: in VariableManager get_vars() 22225 1726882769.85326: Calling all_inventory to load vars for managed_node1 22225 1726882769.85329: Calling groups_inventory to load vars for managed_node1 22225 1726882769.85331: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.85339: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000005d 22225 1726882769.85343: WORKER PROCESS EXITING 22225 1726882769.85359: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.85362: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.85365: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.87121: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882769.89260: done with get_vars() 22225 1726882769.89286: variable 'ansible_search_path' from source: unknown 22225 1726882769.89302: we have included files to process 22225 1726882769.89303: generating all_blocks data 22225 1726882769.89306: done generating all_blocks data 22225 1726882769.89310: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 22225 1726882769.89311: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 22225 1726882769.89314: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 22225 1726882769.89529: in VariableManager get_vars() 22225 1726882769.89555: done with get_vars() 22225 1726882769.89843: done processing included file 22225 1726882769.89845: iterating over new_blocks loaded from include file 22225 1726882769.89847: in VariableManager get_vars() 22225 1726882769.89864: done with get_vars() 22225 1726882769.89866: filtering new block on tags 22225 1726882769.89893: done filtering new block on tags 22225 1726882769.89896: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 22225 1726882769.89901: extending task lists for all hosts with included blocks 22225 1726882769.92456: done extending task lists 22225 1726882769.92458: done processing included files 22225 1726882769.92459: results queue empty 22225 1726882769.92459: checking for any_errors_fatal 22225 1726882769.92463: done checking for any_errors_fatal 22225 1726882769.92464: checking for max_fail_percentage 22225 1726882769.92465: done checking for max_fail_percentage 22225 1726882769.92466: checking to see if all hosts have failed and the running result is not ok 22225 1726882769.92467: done checking to see if all hosts have failed 22225 1726882769.92468: getting the remaining hosts for this loop 22225 1726882769.92469: done getting the remaining hosts for this loop 22225 1726882769.92472: getting the next task for host managed_node1 22225 1726882769.92476: done getting next task for host managed_node1 22225 1726882769.92478: ^ task is: TASK: Include the task 'get_profile_stat.yml' 22225 1726882769.92481: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882769.92484: getting variables 22225 1726882769.92485: in VariableManager get_vars() 22225 1726882769.92500: Calling all_inventory to load vars for managed_node1 22225 1726882769.92508: Calling groups_inventory to load vars for managed_node1 22225 1726882769.92511: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.92518: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.92520: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.92526: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.94105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882769.96309: done with get_vars() 22225 1726882769.96337: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:39:29 -0400 (0:00:00.127) 0:00:25.356 ****** 22225 1726882769.96424: entering _queue_task() for managed_node1/include_tasks 22225 1726882769.96801: worker is 1 (out of 1 available) 22225 1726882769.96815: exiting _queue_task() for managed_node1/include_tasks 22225 1726882769.96830: done queuing things up, now waiting for results queue to drain 22225 1726882769.96832: waiting for pending results... 22225 1726882769.97142: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 22225 1726882769.97227: in run() - task 0affc7ec-ae25-ec05-55b7-0000000003b8 22225 1726882769.97429: variable 'ansible_search_path' from source: unknown 22225 1726882769.97433: variable 'ansible_search_path' from source: unknown 22225 1726882769.97436: calling self._execute() 22225 1726882769.97439: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882769.97442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882769.97445: variable 'omit' from source: magic vars 22225 1726882769.97810: variable 'ansible_distribution_major_version' from source: facts 22225 1726882769.97936: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882769.97955: _execute() done 22225 1726882769.97965: dumping result to json 22225 1726882769.97972: done dumping result, returning 22225 1726882769.97982: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affc7ec-ae25-ec05-55b7-0000000003b8] 22225 1726882769.97993: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003b8 22225 1726882769.98164: no more pending results, returning what we have 22225 1726882769.98170: in VariableManager get_vars() 22225 1726882769.98228: Calling all_inventory to load vars for managed_node1 22225 1726882769.98232: Calling groups_inventory to load vars for managed_node1 22225 1726882769.98234: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882769.98252: Calling all_plugins_play to load vars for managed_node1 22225 1726882769.98255: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882769.98258: Calling groups_plugins_play to load vars for managed_node1 22225 1726882769.98959: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003b8 22225 1726882769.98962: WORKER PROCESS EXITING 22225 1726882770.00723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 22225 1726882770.03065: done with get_vars() 22225 1726882770.03092: variable 'ansible_search_path' from source: unknown 22225 1726882770.03094: variable 'ansible_search_path' from source: unknown 22225 1726882770.03148: we have included files to process 22225 1726882770.03149: generating all_blocks data 22225 1726882770.03151: done generating all_blocks data 22225 1726882770.03152: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22225 1726882770.03153: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22225 1726882770.03156: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22225 1726882770.04518: done processing included file 22225 1726882770.04520: iterating over new_blocks loaded from include file 22225 1726882770.04525: in VariableManager get_vars() 22225 1726882770.04555: done with get_vars() 22225 1726882770.04557: filtering new block on tags 22225 1726882770.04584: done filtering new block on tags 22225 1726882770.04587: in VariableManager get_vars() 22225 1726882770.04608: done with get_vars() 22225 1726882770.04610: filtering new block on tags 22225 1726882770.04636: done filtering new block on tags 22225 1726882770.04638: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 22225 1726882770.04644: extending task lists for all hosts with included blocks 22225 1726882770.04865: done extending task lists 22225 1726882770.04867: done processing included files 22225 1726882770.04868: results queue empty 22225 1726882770.04868: checking for any_errors_fatal 22225 1726882770.04879: done checking for any_errors_fatal 22225 1726882770.04880: checking for max_fail_percentage 22225 1726882770.04881: done checking for max_fail_percentage 22225 1726882770.04882: checking to see if all hosts have failed and the running result is not ok 22225 1726882770.04883: done checking to see if all hosts have failed 22225 1726882770.04883: getting the remaining hosts for this loop 22225 1726882770.04885: done getting the remaining hosts for this loop 22225 1726882770.04888: getting the next task for host managed_node1 22225 1726882770.04892: done getting next task for host managed_node1 22225 1726882770.04895: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 22225 1726882770.04898: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882770.04901: getting variables 22225 1726882770.04902: in VariableManager get_vars() 22225 1726882770.04975: Calling all_inventory to load vars for managed_node1 22225 1726882770.04978: Calling groups_inventory to load vars for managed_node1 22225 1726882770.04988: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882770.04994: Calling all_plugins_play to load vars for managed_node1 22225 1726882770.04997: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882770.05000: Calling groups_plugins_play to load vars for managed_node1 22225 1726882770.06538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882770.08846: done with get_vars() 22225 1726882770.08887: done getting variables 22225 1726882770.08945: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:39:30 -0400 (0:00:00.125) 0:00:25.482 ****** 22225 1726882770.08993: entering _queue_task() for managed_node1/set_fact 22225 1726882770.09658: worker is 1 (out of 1 available) 22225 1726882770.09670: exiting _queue_task() for managed_node1/set_fact 22225 1726882770.09684: done queuing things up, now waiting for results queue to drain 22225 1726882770.09685: waiting for pending results... 
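Before the worker picks this task up, it helps to know what it does: the result logged just below returns three ansible_facts (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint), all false. The task at get_profile_stat.yml:3 is therefore almost certainly a plain set_fact that resets these flags before the profile checks run. A minimal sketch reconstructed from this log (the actual task file may differ in layout):

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false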
22225 1726882770.09782: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 22225 1726882770.09921: in run() - task 0affc7ec-ae25-ec05-55b7-0000000004b0 22225 1726882770.09947: variable 'ansible_search_path' from source: unknown 22225 1726882770.09955: variable 'ansible_search_path' from source: unknown 22225 1726882770.10002: calling self._execute() 22225 1726882770.10117: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882770.10139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.10154: variable 'omit' from source: magic vars 22225 1726882770.10950: variable 'ansible_distribution_major_version' from source: facts 22225 1726882770.10954: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882770.10956: variable 'omit' from source: magic vars 22225 1726882770.10958: variable 'omit' from source: magic vars 22225 1726882770.11044: variable 'omit' from source: magic vars 22225 1726882770.11100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882770.11208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882770.11302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882770.11327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882770.11353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882770.11419: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882770.11715: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882770.11718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.11731: Set connection var ansible_connection to ssh 22225 1726882770.11747: Set connection var ansible_pipelining to False 22225 1726882770.11760: Set connection var ansible_shell_executable to /bin/sh 22225 1726882770.11770: Set connection var ansible_timeout to 10 22225 1726882770.11776: Set connection var ansible_shell_type to sh 22225 1726882770.11789: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882770.11854: variable 'ansible_shell_executable' from source: unknown 22225 1726882770.11939: variable 'ansible_connection' from source: unknown 22225 1726882770.11949: variable 'ansible_module_compression' from source: unknown 22225 1726882770.11956: variable 'ansible_shell_type' from source: unknown 22225 1726882770.11962: variable 'ansible_shell_executable' from source: unknown 22225 1726882770.11969: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882770.11977: variable 'ansible_pipelining' from source: unknown 22225 1726882770.11988: variable 'ansible_timeout' from source: unknown 22225 1726882770.11997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.12302: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882770.12388: variable 
'omit' from source: magic vars 22225 1726882770.12592: starting attempt loop 22225 1726882770.12595: running the handler 22225 1726882770.12598: handler run complete 22225 1726882770.12600: attempt loop complete, returning result 22225 1726882770.12602: _execute() done 22225 1726882770.12605: dumping result to json 22225 1726882770.12607: done dumping result, returning 22225 1726882770.12609: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affc7ec-ae25-ec05-55b7-0000000004b0] 22225 1726882770.12611: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b0 22225 1726882770.12689: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b0 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 22225 1726882770.12762: no more pending results, returning what we have 22225 1726882770.12767: results queue empty 22225 1726882770.12768: checking for any_errors_fatal 22225 1726882770.12770: done checking for any_errors_fatal 22225 1726882770.12771: checking for max_fail_percentage 22225 1726882770.12773: done checking for max_fail_percentage 22225 1726882770.12774: checking to see if all hosts have failed and the running result is not ok 22225 1726882770.12775: done checking to see if all hosts have failed 22225 1726882770.12776: getting the remaining hosts for this loop 22225 1726882770.12778: done getting the remaining hosts for this loop 22225 1726882770.12786: getting the next task for host managed_node1 22225 1726882770.12793: done getting next task for host managed_node1 22225 1726882770.12797: ^ task is: TASK: Stat profile file 22225 1726882770.12804: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882770.12809: getting variables 22225 1726882770.12811: in VariableManager get_vars() 22225 1726882770.12863: Calling all_inventory to load vars for managed_node1 22225 1726882770.12867: Calling groups_inventory to load vars for managed_node1 22225 1726882770.12870: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882770.12887: Calling all_plugins_play to load vars for managed_node1 22225 1726882770.12891: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882770.12895: Calling groups_plugins_play to load vars for managed_node1 22225 1726882770.13439: WORKER PROCESS EXITING 22225 1726882770.15050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882770.18129: done with get_vars() 22225 1726882770.18165: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:39:30 -0400 (0:00:00.092) 0:00:25.575 ****** 22225 1726882770.18296: entering _queue_task() for managed_node1/stat 22225 1726882770.18702: worker is 1 (out of 1 available) 22225 1726882770.18715: exiting _queue_task() for managed_node1/stat 22225 1726882770.18839: done queuing things up, now waiting for results queue to drain 22225 1726882770.18841: waiting for pending results... 22225 1726882770.19055: running TaskExecutor() for managed_node1/TASK: Stat profile file 22225 1726882770.19202: in run() - task 0affc7ec-ae25-ec05-55b7-0000000004b1 22225 1726882770.19227: variable 'ansible_search_path' from source: unknown 22225 1726882770.19237: variable 'ansible_search_path' from source: unknown 22225 1726882770.19292: calling self._execute() 22225 1726882770.19513: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882770.19518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.19524: variable 'omit' from source: magic vars 22225 1726882770.20330: variable 'ansible_distribution_major_version' from source: facts 22225 1726882770.20334: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882770.20336: variable 'omit' from source: magic vars 22225 1726882770.20339: variable 'omit' from source: magic vars 22225 1726882770.20517: variable 'profile' from source: include params 22225 1726882770.20636: variable 'interface' from source: play vars 22225 1726882770.20724: variable 'interface' from source: play vars 22225 1726882770.20875: variable 'omit' from source: magic vars 22225 1726882770.20916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882770.20963: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882770.21007: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882770.21115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882770.21139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882770.21174: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882770.21200: variable 'ansible_host' from source: host vars for 
'managed_node1' 22225 1726882770.21425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.21640: Set connection var ansible_connection to ssh 22225 1726882770.21643: Set connection var ansible_pipelining to False 22225 1726882770.21646: Set connection var ansible_shell_executable to /bin/sh 22225 1726882770.21648: Set connection var ansible_timeout to 10 22225 1726882770.21651: Set connection var ansible_shell_type to sh 22225 1726882770.21653: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882770.21655: variable 'ansible_shell_executable' from source: unknown 22225 1726882770.21657: variable 'ansible_connection' from source: unknown 22225 1726882770.21659: variable 'ansible_module_compression' from source: unknown 22225 1726882770.21661: variable 'ansible_shell_type' from source: unknown 22225 1726882770.21664: variable 'ansible_shell_executable' from source: unknown 22225 1726882770.21666: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882770.21667: variable 'ansible_pipelining' from source: unknown 22225 1726882770.21670: variable 'ansible_timeout' from source: unknown 22225 1726882770.21672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.22144: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882770.22161: variable 'omit' from source: magic vars 22225 1726882770.22172: starting attempt loop 22225 1726882770.22186: running the handler 22225 1726882770.22204: _low_level_execute_command(): starting 22225 1726882770.22398: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882770.23550: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882770.23567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882770.23583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882770.23647: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.23843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.24047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.25812: stdout chunk (state=3): >>>/root <<< 22225 1726882770.25910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882770.25992: stderr chunk (state=3): >>><<< 
22225 1726882770.26018: stdout chunk (state=3): >>><<< 22225 1726882770.26359: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882770.26363: _low_level_execute_command(): starting 22225 1726882770.26367: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635 `" && echo ansible-tmp-1726882770.2625492-23118-262731424782635="` echo /root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635 `" ) && sleep 0' 22225 1726882770.27865: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882770.27882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882770.27903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882770.28059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882770.28062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.28141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.28214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.30214: stdout chunk (state=3): >>>ansible-tmp-1726882770.2625492-23118-262731424782635=/root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635 <<< 22225 1726882770.30331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882770.30599: 
stderr chunk (state=3): >>><<< 22225 1726882770.30927: stdout chunk (state=3): >>><<< 22225 1726882770.30931: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882770.2625492-23118-262731424782635=/root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882770.30933: variable 'ansible_module_compression' from source: unknown 22225 1726882770.30936: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22225 1726882770.30938: variable 'ansible_facts' from source: unknown 22225 1726882770.31136: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/AnsiballZ_stat.py 22225 1726882770.31515: Sending initial data 22225 1726882770.31528: Sent initial data (153 bytes) 22225 1726882770.32571: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882770.32669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882770.32685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.32691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.33005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.34615: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22225 1726882770.34627: stderr chunk (state=3): >>>debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 <<< 22225 1726882770.34634: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 22225 1726882770.34642: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 22225 1726882770.34650: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 22225 1726882770.34658: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 22225 1726882770.34665: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 22225 1726882770.34672: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 22225 1726882770.34682: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 22225 1726882770.34685: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 <<< 22225 1726882770.34693: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" <<< 22225 1726882770.34706: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882770.34781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882770.34850: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpkijyzz2e /root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/AnsiballZ_stat.py <<< 22225 1726882770.34853: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/AnsiballZ_stat.py" <<< 22225 1726882770.34891: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpkijyzz2e" to remote "/root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/AnsiballZ_stat.py" <<< 22225 1726882770.36157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882770.36160: stdout chunk (state=3): >>><<< 22225 1726882770.36163: stderr chunk (state=3): >>><<< 22225 1726882770.36165: done transferring module to remote 22225 1726882770.36167: _low_level_execute_command(): starting 22225 1726882770.36169: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/ /root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/AnsiballZ_stat.py && sleep 0' 22225 1726882770.37381: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882770.37430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882770.37448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.37470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.37604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.39444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882770.39520: stderr chunk (state=3): >>><<< 22225 1726882770.39525: stdout chunk (state=3): >>><<< 22225 1726882770.39543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882770.39546: _low_level_execute_command(): starting 22225 1726882770.39552: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/AnsiballZ_stat.py && sleep 0' 22225 1726882770.40203: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882770.40212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882770.40225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882770.40240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882770.40252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882770.40259: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882770.40269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882770.40290: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882770.40427: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882770.40430: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882770.40433: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882770.40436: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882770.40439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882770.40442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882770.40444: stderr chunk (state=3): >>>debug2: match found <<< 22225 1726882770.40446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882770.40448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882770.40451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.40457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.40543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.57125: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22225 1726882770.58568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882770.58572: stdout chunk (state=3): >>><<< 22225 1726882770.58574: stderr chunk (state=3): >>><<< 22225 1726882770.58829: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
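The stat module run above exposes the exact arguments the "Stat profile file" task (get_profile_stat.yml:9) passed: path /etc/sysconfig/network-scripts/ifcfg-veth0 with get_attributes, get_checksum and get_mime disabled, and the result is registered as profile_stat (it is tested as profile_stat.stat.exists further down). A sketch of that task, with the templated path inferred from the 'profile' include parameter seen earlier in the log:

    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # resolves to ifcfg-veth0 in this run
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat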
22225 1726882770.58834: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882770.58837: _low_level_execute_command(): starting 22225 1726882770.58840: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882770.2625492-23118-262731424782635/ > /dev/null 2>&1 && sleep 0' 22225 1726882770.60040: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882770.60084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882770.60116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.60124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.60237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.62463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882770.62466: stderr chunk (state=3): >>><<< 22225 1726882770.62469: stdout chunk (state=3): >>><<< 22225 1726882770.62472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882770.62474: handler run complete 22225 1726882770.62496: attempt loop complete, returning result 22225 1726882770.62499: _execute() done 22225 1726882770.62502: dumping result to json 22225 1726882770.62504: done dumping result, returning 22225 1726882770.62581: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affc7ec-ae25-ec05-55b7-0000000004b1] 22225 1726882770.62585: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b1 22225 1726882770.62662: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b1 22225 1726882770.62665: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 22225 1726882770.62758: no more pending results, returning what we have 22225 1726882770.62763: results queue empty 22225 1726882770.62765: checking for any_errors_fatal 22225 1726882770.62772: done checking for any_errors_fatal 22225 1726882770.62773: checking for max_fail_percentage 22225 1726882770.62775: done checking for max_fail_percentage 22225 1726882770.62776: checking to see if all hosts have failed and the running result is not ok 22225 1726882770.62777: done checking to see if all hosts have failed 22225 1726882770.62778: getting the remaining hosts for this loop 22225 1726882770.62783: done getting the remaining hosts for this loop 22225 1726882770.62788: getting the next task for host managed_node1 22225 1726882770.62801: done getting next task for host managed_node1 22225 1726882770.62805: ^ task is: TASK: Set NM profile exist flag based on the profile files 22225 1726882770.62809: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882770.62815: getting variables 22225 1726882770.62817: in VariableManager get_vars() 22225 1726882770.62866: Calling all_inventory to load vars for managed_node1 22225 1726882770.62870: Calling groups_inventory to load vars for managed_node1 22225 1726882770.62872: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882770.62886: Calling all_plugins_play to load vars for managed_node1 22225 1726882770.62889: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882770.62892: Calling groups_plugins_play to load vars for managed_node1 22225 1726882770.66109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882770.68519: done with get_vars() 22225 1726882770.68549: done getting variables 22225 1726882770.68616: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:39:30 -0400 (0:00:00.503) 0:00:26.079 ****** 22225 1726882770.68652: entering _queue_task() for managed_node1/set_fact 22225 1726882770.69372: worker is 1 (out of 1 available) 22225 1726882770.69389: exiting _queue_task() for managed_node1/set_fact 22225 1726882770.69400: done queuing things up, now waiting for results queue to drain 22225 1726882770.69401: waiting for pending results... 
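The next task, "Set NM profile exist flag based on the profile files" (get_profile_stat.yml:17), is skipped below because its when: condition, profile_stat.stat.exists, evaluated to False (no ifcfg file was found for veth0). A plausible sketch of that task; only the condition is visible in this log, so the fact value is an assumption:

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true   # assumed value; not shown in this log
      when: profile_stat.stat.exists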
22225 1726882770.69692: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 22225 1726882770.69758: in run() - task 0affc7ec-ae25-ec05-55b7-0000000004b2 22225 1726882770.69789: variable 'ansible_search_path' from source: unknown 22225 1726882770.69898: variable 'ansible_search_path' from source: unknown 22225 1726882770.69902: calling self._execute() 22225 1726882770.69949: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882770.69962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.69977: variable 'omit' from source: magic vars 22225 1726882770.70428: variable 'ansible_distribution_major_version' from source: facts 22225 1726882770.70477: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882770.70673: variable 'profile_stat' from source: set_fact 22225 1726882770.70699: Evaluated conditional (profile_stat.stat.exists): False 22225 1726882770.70707: when evaluation is False, skipping this task 22225 1726882770.70714: _execute() done 22225 1726882770.70724: dumping result to json 22225 1726882770.70733: done dumping result, returning 22225 1726882770.70744: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affc7ec-ae25-ec05-55b7-0000000004b2] 22225 1726882770.70757: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b2 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22225 1726882770.70934: no more pending results, returning what we have 22225 1726882770.70939: results queue empty 22225 1726882770.70940: checking for any_errors_fatal 22225 1726882770.70949: done checking for any_errors_fatal 22225 1726882770.70949: checking for max_fail_percentage 22225 1726882770.70951: done checking for max_fail_percentage 22225 1726882770.70952: checking to see if all hosts have failed and the running result is not ok 22225 1726882770.70953: done checking to see if all hosts have failed 22225 1726882770.70954: getting the remaining hosts for this loop 22225 1726882770.70956: done getting the remaining hosts for this loop 22225 1726882770.70960: getting the next task for host managed_node1 22225 1726882770.70968: done getting next task for host managed_node1 22225 1726882770.70971: ^ task is: TASK: Get NM profile info 22225 1726882770.70977: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882770.70986: getting variables 22225 1726882770.70987: in VariableManager get_vars() 22225 1726882770.71037: Calling all_inventory to load vars for managed_node1 22225 1726882770.71040: Calling groups_inventory to load vars for managed_node1 22225 1726882770.71042: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882770.71059: Calling all_plugins_play to load vars for managed_node1 22225 1726882770.71062: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882770.71066: Calling groups_plugins_play to load vars for managed_node1 22225 1726882770.71743: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b2 22225 1726882770.71747: WORKER PROCESS EXITING 22225 1726882770.73160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882770.75437: done with get_vars() 22225 1726882770.75475: done getting variables 22225 1726882770.75549: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:39:30 -0400 (0:00:00.069) 0:00:26.148 ****** 22225 1726882770.75594: entering _queue_task() for managed_node1/shell 22225 1726882770.76047: worker is 1 (out of 1 available) 22225 1726882770.76062: exiting _queue_task() for managed_node1/shell 22225 1726882770.76074: done queuing things up, now waiting for results queue to drain 22225 1726882770.76076: waiting for pending results... 
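The following task, "Get NM profile info" (get_profile_stat.yml:25), runs through the shell action plugin, so it queries NetworkManager with a command rather than a module. The command itself is not visible in this excerpt; a hypothetical equivalent that would perform the same kind of lookup is:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep "{{ profile }}"   # hypothetical command; the real one is not shown here
      register: nm_profile_exists   # register name assumed
      ignore_errors: true           # assumed, so a missing profile does not fail the play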
22225 1726882770.76343: running TaskExecutor() for managed_node1/TASK: Get NM profile info 22225 1726882770.76492: in run() - task 0affc7ec-ae25-ec05-55b7-0000000004b3 22225 1726882770.76516: variable 'ansible_search_path' from source: unknown 22225 1726882770.76528: variable 'ansible_search_path' from source: unknown 22225 1726882770.76583: calling self._execute() 22225 1726882770.76694: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882770.76708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.76766: variable 'omit' from source: magic vars 22225 1726882770.77167: variable 'ansible_distribution_major_version' from source: facts 22225 1726882770.77188: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882770.77206: variable 'omit' from source: magic vars 22225 1726882770.77271: variable 'omit' from source: magic vars 22225 1726882770.77395: variable 'profile' from source: include params 22225 1726882770.77406: variable 'interface' from source: play vars 22225 1726882770.77530: variable 'interface' from source: play vars 22225 1726882770.77533: variable 'omit' from source: magic vars 22225 1726882770.77571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882770.77617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882770.77654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882770.77678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882770.77698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882770.77748: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882770.77752: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882770.77829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.77882: Set connection var ansible_connection to ssh 22225 1726882770.77901: Set connection var ansible_pipelining to False 22225 1726882770.77916: Set connection var ansible_shell_executable to /bin/sh 22225 1726882770.77930: Set connection var ansible_timeout to 10 22225 1726882770.77943: Set connection var ansible_shell_type to sh 22225 1726882770.77966: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882770.77991: variable 'ansible_shell_executable' from source: unknown 22225 1726882770.78052: variable 'ansible_connection' from source: unknown 22225 1726882770.78056: variable 'ansible_module_compression' from source: unknown 22225 1726882770.78058: variable 'ansible_shell_type' from source: unknown 22225 1726882770.78061: variable 'ansible_shell_executable' from source: unknown 22225 1726882770.78063: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882770.78065: variable 'ansible_pipelining' from source: unknown 22225 1726882770.78072: variable 'ansible_timeout' from source: unknown 22225 1726882770.78075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882770.78220: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882770.78241: variable 'omit' from source: magic vars 22225 1726882770.78253: starting attempt loop 22225 1726882770.78261: running the handler 22225 1726882770.78295: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882770.78315: _low_level_execute_command(): starting 22225 1726882770.78331: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882770.79263: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882770.79284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.79305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.79403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.81146: stdout chunk (state=3): >>>/root <<< 22225 1726882770.81349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882770.81353: stdout chunk (state=3): >>><<< 22225 1726882770.81356: stderr chunk (state=3): >>><<< 22225 1726882770.81379: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882770.81404: _low_level_execute_command(): starting 22225 1726882770.81456: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850 `" && echo ansible-tmp-1726882770.813894-23156-253874481137850="` echo /root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850 `" ) && sleep 0' 22225 1726882770.82134: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882770.82242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882770.82267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.82296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.82391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.84351: stdout chunk (state=3): >>>ansible-tmp-1726882770.813894-23156-253874481137850=/root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850 <<< 22225 1726882770.84561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882770.84564: stdout chunk (state=3): >>><<< 22225 1726882770.84567: stderr chunk (state=3): >>><<< 22225 1726882770.84729: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882770.813894-23156-253874481137850=/root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882770.84733: variable 'ansible_module_compression' from source: unknown 22225 1726882770.84735: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882770.84738: variable 'ansible_facts' from source: unknown 22225 1726882770.84826: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/AnsiballZ_command.py 22225 1726882770.84982: Sending initial data 22225 1726882770.85086: Sent initial data (155 bytes) 22225 1726882770.85703: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882770.85719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882770.85744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882770.85766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882770.85856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882770.85900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.85925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.86012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.87605: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882770.87674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882770.87765: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpdn_u73mw /root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/AnsiballZ_command.py <<< 22225 1726882770.87768: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/AnsiballZ_command.py" <<< 22225 1726882770.87816: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpdn_u73mw" to remote "/root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/AnsiballZ_command.py" <<< 22225 1726882770.88654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882770.88691: stderr chunk (state=3): >>><<< 22225 1726882770.88798: stdout chunk (state=3): >>><<< 22225 1726882770.88802: done transferring module to remote 22225 1726882770.88804: _low_level_execute_command(): starting 22225 1726882770.88807: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/ /root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/AnsiballZ_command.py && sleep 0' 22225 1726882770.89411: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882770.89428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882770.89448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882770.89471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882770.89488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882770.89590: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.89620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.89702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882770.91600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882770.91604: stdout chunk (state=3): >>><<< 22225 1726882770.91607: stderr chunk (state=3): >>><<< 22225 1726882770.91630: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882770.91725: _low_level_execute_command(): starting 22225 1726882770.91730: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/AnsiballZ_command.py && sleep 0' 22225 1726882770.92314: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882770.92330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882770.92344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882770.92370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882770.92478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882770.92501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882770.92594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882771.10863: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:39:31.087233", "end": "2024-09-20 21:39:31.106117", "delta": "0:00:00.018884", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882771.12693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882771.12697: stdout chunk (state=3): >>><<< 22225 1726882771.12700: stderr chunk (state=3): >>><<< 22225 1726882771.12702: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:39:31.087233", "end": "2024-09-20 21:39:31.106117", "delta": "0:00:00.018884", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
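Annotation: the module result above comes from the pipeline nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc. A minimal Python sketch that reproduces the same lookup by hand follows; it assumes nmcli is installed and that the profile of interest is named veth0, and it only illustrates the check, it is not the role's code.

import subprocess

def nm_profile_in_etc(profile: str = "veth0") -> tuple[int, str]:
    """Run the same lookup the task ran: list NAME,FILENAME pairs from
    NetworkManager and keep lines mentioning the profile and /etc."""
    cmd = f"nmcli -f NAME,FILENAME connection show | grep {profile} | grep /etc"
    proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return proc.returncode, proc.stdout

if __name__ == "__main__":
    rc, out = nm_profile_in_etc()
    # rc == 0 plus a line such as
    # "veth0  /etc/NetworkManager/system-connections/veth0.nmconnection"
    # means the profile exists and its keyfile is stored under /etc.
    print(rc, out.strip())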
22225 1726882771.12707: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882771.12709: _low_level_execute_command(): starting 22225 1726882771.12711: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882770.813894-23156-253874481137850/ > /dev/null 2>&1 && sleep 0' 22225 1726882771.13846: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882771.13861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882771.13874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882771.14087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882771.14115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882771.14261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882771.14300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882771.16347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882771.16358: stdout chunk (state=3): >>><<< 22225 1726882771.16371: stderr chunk (state=3): >>><<< 22225 1726882771.16541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882771.16545: handler run complete 22225 1726882771.16547: Evaluated conditional (False): False 22225 1726882771.16550: attempt loop complete, returning result 22225 1726882771.16552: _execute() done 22225 1726882771.16554: dumping result to json 22225 1726882771.16556: done dumping result, returning 22225 1726882771.16558: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affc7ec-ae25-ec05-55b7-0000000004b3] 22225 1726882771.16560: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b3 ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.018884", "end": "2024-09-20 21:39:31.106117", "rc": 0, "start": "2024-09-20 21:39:31.087233" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 22225 1726882771.17048: no more pending results, returning what we have 22225 1726882771.17053: results queue empty 22225 1726882771.17054: checking for any_errors_fatal 22225 1726882771.17061: done checking for any_errors_fatal 22225 1726882771.17062: checking for max_fail_percentage 22225 1726882771.17064: done checking for max_fail_percentage 22225 1726882771.17065: checking to see if all hosts have failed and the running result is not ok 22225 1726882771.17067: done checking to see if all hosts have failed 22225 1726882771.17290: getting the remaining hosts for this loop 22225 1726882771.17293: done getting the remaining hosts for this loop 22225 1726882771.17298: getting the next task for host managed_node1 22225 1726882771.17305: done getting next task for host managed_node1 22225 1726882771.17308: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 22225 1726882771.17313: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882771.17317: getting variables 22225 1726882771.17318: in VariableManager get_vars() 22225 1726882771.17366: Calling all_inventory to load vars for managed_node1 22225 1726882771.17369: Calling groups_inventory to load vars for managed_node1 22225 1726882771.17371: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882771.17384: Calling all_plugins_play to load vars for managed_node1 22225 1726882771.17388: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882771.17391: Calling groups_plugins_play to load vars for managed_node1 22225 1726882771.18043: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b3 22225 1726882771.18047: WORKER PROCESS EXITING 22225 1726882771.21598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882771.26314: done with get_vars() 22225 1726882771.26344: done getting variables 22225 1726882771.26407: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:39:31 -0400 (0:00:00.509) 0:00:26.658 ****** 22225 1726882771.26548: entering _queue_task() for managed_node1/set_fact 22225 1726882771.27105: worker is 1 (out of 1 available) 22225 1726882771.27119: exiting _queue_task() for managed_node1/set_fact 22225 1726882771.27437: done queuing things up, now waiting for results queue to drain 22225 1726882771.27439: waiting for pending results... 
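Annotation: the block of _low_level_execute_command() calls above follows the usual push-module pattern: create a private temp directory on the target, sftp the AnsiballZ payload into it, mark it executable, run it with the remote Python, then delete the directory. The sketch below only rebuilds command strings modelled on the ones in the log, with the timestamped directory name parameterised; it prints them rather than executing anything, and it is not how ansible-core constructs them internally.

import os
import random
import time

def remote_exec_plan(module_file: str = "AnsiballZ_command.py",
                     remote_python: str = "/usr/bin/python3.12") -> list[str]:
    # Temp dir name shaped like the one in the log:
    # ansible-tmp-<epoch>-<pid>-<random>
    tmp = (f"/root/.ansible/tmp/ansible-tmp-{time.time()}-"
           f"{os.getpid()}-{random.randrange(10**15)}")
    module_path = f"{tmp}/{module_file}"
    return [
        f"/bin/sh -c '( umask 77 && mkdir -p \"{tmp}\" ) && sleep 0'",
        f"# sftp put <local AnsiballZ payload> {module_path}",
        f"/bin/sh -c 'chmod u+x {tmp}/ {module_path} && sleep 0'",
        f"/bin/sh -c '{remote_python} {module_path} && sleep 0'",
        f"/bin/sh -c 'rm -f -r {tmp}/ > /dev/null 2>&1 && sleep 0'",
    ]

if __name__ == "__main__":
    for step in remote_exec_plan():
        print(step)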
22225 1726882771.27673: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 22225 1726882771.27773: in run() - task 0affc7ec-ae25-ec05-55b7-0000000004b4 22225 1726882771.27791: variable 'ansible_search_path' from source: unknown 22225 1726882771.27795: variable 'ansible_search_path' from source: unknown 22225 1726882771.28060: calling self._execute() 22225 1726882771.28266: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.28273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.28286: variable 'omit' from source: magic vars 22225 1726882771.29330: variable 'ansible_distribution_major_version' from source: facts 22225 1726882771.29334: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882771.29557: variable 'nm_profile_exists' from source: set_fact 22225 1726882771.29562: Evaluated conditional (nm_profile_exists.rc == 0): True 22225 1726882771.29565: variable 'omit' from source: magic vars 22225 1726882771.29619: variable 'omit' from source: magic vars 22225 1726882771.29656: variable 'omit' from source: magic vars 22225 1726882771.29702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882771.29961: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882771.29968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882771.29989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882771.30001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882771.30035: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882771.30038: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.30041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.30350: Set connection var ansible_connection to ssh 22225 1726882771.30360: Set connection var ansible_pipelining to False 22225 1726882771.30368: Set connection var ansible_shell_executable to /bin/sh 22225 1726882771.30374: Set connection var ansible_timeout to 10 22225 1726882771.30376: Set connection var ansible_shell_type to sh 22225 1726882771.30385: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882771.30408: variable 'ansible_shell_executable' from source: unknown 22225 1726882771.30411: variable 'ansible_connection' from source: unknown 22225 1726882771.30414: variable 'ansible_module_compression' from source: unknown 22225 1726882771.30416: variable 'ansible_shell_type' from source: unknown 22225 1726882771.30419: variable 'ansible_shell_executable' from source: unknown 22225 1726882771.30421: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.30428: variable 'ansible_pipelining' from source: unknown 22225 1726882771.30430: variable 'ansible_timeout' from source: unknown 22225 1726882771.30436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.30883: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882771.30886: variable 'omit' from source: magic vars 22225 1726882771.30889: starting attempt loop 22225 1726882771.30892: running the handler 22225 1726882771.30894: handler run complete 22225 1726882771.30897: attempt loop complete, returning result 22225 1726882771.30899: _execute() done 22225 1726882771.30901: dumping result to json 22225 1726882771.30904: done dumping result, returning 22225 1726882771.30906: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affc7ec-ae25-ec05-55b7-0000000004b4] 22225 1726882771.30908: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b4 22225 1726882771.30979: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b4 22225 1726882771.30985: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 22225 1726882771.31050: no more pending results, returning what we have 22225 1726882771.31062: results queue empty 22225 1726882771.31065: checking for any_errors_fatal 22225 1726882771.31075: done checking for any_errors_fatal 22225 1726882771.31076: checking for max_fail_percentage 22225 1726882771.31078: done checking for max_fail_percentage 22225 1726882771.31079: checking to see if all hosts have failed and the running result is not ok 22225 1726882771.31080: done checking to see if all hosts have failed 22225 1726882771.31080: getting the remaining hosts for this loop 22225 1726882771.31082: done getting the remaining hosts for this loop 22225 1726882771.31087: getting the next task for host managed_node1 22225 1726882771.31098: done getting next task for host managed_node1 22225 1726882771.31101: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 22225 1726882771.31107: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882771.31112: getting variables 22225 1726882771.31113: in VariableManager get_vars() 22225 1726882771.31159: Calling all_inventory to load vars for managed_node1 22225 1726882771.31162: Calling groups_inventory to load vars for managed_node1 22225 1726882771.31164: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882771.31176: Calling all_plugins_play to load vars for managed_node1 22225 1726882771.31179: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882771.31182: Calling groups_plugins_play to load vars for managed_node1 22225 1726882771.35157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882771.38310: done with get_vars() 22225 1726882771.38340: done getting variables 22225 1726882771.38404: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882771.38537: variable 'profile' from source: include params 22225 1726882771.38541: variable 'interface' from source: play vars 22225 1726882771.38611: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:39:31 -0400 (0:00:00.121) 0:00:26.779 ****** 22225 1726882771.38654: entering _queue_task() for managed_node1/command 22225 1726882771.39020: worker is 1 (out of 1 available) 22225 1726882771.39237: exiting _queue_task() for managed_node1/command 22225 1726882771.39248: done queuing things up, now waiting for results queue to drain 22225 1726882771.39249: waiting for pending results... 
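Annotation: the set_fact result above turns the nmcli lookup into three booleans. The only derivation visible in the log is the conditional nm_profile_exists.rc == 0; how the role decides the ansible_managed and fingerprint flags is not shown in this section, so the sketch below simply mirrors what was logged and hard-codes those two flags the way the result reported them. It is an illustration, not the role's logic.

def profile_flags(nm_profile_exists_rc: int) -> dict[str, bool]:
    """Mirror the facts logged for veth0: the exists flag follows the
    rc == 0 conditional; the other two are taken from the logged result
    (their real derivation lives in get_profile_stat.yml)."""
    exists = nm_profile_exists_rc == 0
    return {
        "lsr_net_profile_exists": exists,
        "lsr_net_profile_ansible_managed": True,  # as logged, not derived here
        "lsr_net_profile_fingerprint": True,      # as logged, not derived here
    }

if __name__ == "__main__":
    print(profile_flags(0))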
22225 1726882771.39412: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-veth0 22225 1726882771.39474: in run() - task 0affc7ec-ae25-ec05-55b7-0000000004b6 22225 1726882771.39598: variable 'ansible_search_path' from source: unknown 22225 1726882771.39603: variable 'ansible_search_path' from source: unknown 22225 1726882771.39606: calling self._execute() 22225 1726882771.39931: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.39935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.39938: variable 'omit' from source: magic vars 22225 1726882771.40646: variable 'ansible_distribution_major_version' from source: facts 22225 1726882771.40658: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882771.40882: variable 'profile_stat' from source: set_fact 22225 1726882771.40908: Evaluated conditional (profile_stat.stat.exists): False 22225 1726882771.40926: when evaluation is False, skipping this task 22225 1726882771.40930: _execute() done 22225 1726882771.40933: dumping result to json 22225 1726882771.40936: done dumping result, returning 22225 1726882771.40942: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-veth0 [0affc7ec-ae25-ec05-55b7-0000000004b6] 22225 1726882771.40949: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b6 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22225 1726882771.41103: no more pending results, returning what we have 22225 1726882771.41108: results queue empty 22225 1726882771.41109: checking for any_errors_fatal 22225 1726882771.41117: done checking for any_errors_fatal 22225 1726882771.41118: checking for max_fail_percentage 22225 1726882771.41120: done checking for max_fail_percentage 22225 1726882771.41121: checking to see if all hosts have failed and the running result is not ok 22225 1726882771.41124: done checking to see if all hosts have failed 22225 1726882771.41124: getting the remaining hosts for this loop 22225 1726882771.41126: done getting the remaining hosts for this loop 22225 1726882771.41131: getting the next task for host managed_node1 22225 1726882771.41137: done getting next task for host managed_node1 22225 1726882771.41140: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 22225 1726882771.41145: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882771.41150: getting variables 22225 1726882771.41151: in VariableManager get_vars() 22225 1726882771.41196: Calling all_inventory to load vars for managed_node1 22225 1726882771.41199: Calling groups_inventory to load vars for managed_node1 22225 1726882771.41201: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882771.41215: Calling all_plugins_play to load vars for managed_node1 22225 1726882771.41217: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882771.41219: Calling groups_plugins_play to load vars for managed_node1 22225 1726882771.41930: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b6 22225 1726882771.41934: WORKER PROCESS EXITING 22225 1726882771.44178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882771.46388: done with get_vars() 22225 1726882771.46417: done getting variables 22225 1726882771.46489: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882771.46619: variable 'profile' from source: include params 22225 1726882771.46626: variable 'interface' from source: play vars 22225 1726882771.46688: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:39:31 -0400 (0:00:00.080) 0:00:26.859 ****** 22225 1726882771.46728: entering _queue_task() for managed_node1/set_fact 22225 1726882771.47093: worker is 1 (out of 1 available) 22225 1726882771.47107: exiting _queue_task() for managed_node1/set_fact 22225 1726882771.47120: done queuing things up, now waiting for results queue to drain 22225 1726882771.47121: waiting for pending results... 
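Annotation: this task and the three ifcfg checks that follow are all skipped with false_condition: profile_stat.stat.exists. On this run the veth0 profile is a NetworkManager keyfile under /etc/NetworkManager/system-connections, so an initscripts-style ifcfg-veth0 file presumably does not exist and every ifcfg comment check short-circuits. A tiny sketch of that gate; the ifcfg directory is an assumption on my part, not something the log states.

import os

def should_run_ifcfg_checks(profile: str = "veth0",
                            ifcfg_dir: str = "/etc/sysconfig/network-scripts") -> bool:
    # Equivalent of the task's `when: profile_stat.stat.exists`:
    # only inspect ifcfg-<profile> if the file is actually there.
    return os.path.exists(os.path.join(ifcfg_dir, f"ifcfg-{profile}"))

if __name__ == "__main__":
    if not should_run_ifcfg_checks():
        print("skipping: Conditional result was False")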
22225 1726882771.47426: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-veth0 22225 1726882771.47577: in run() - task 0affc7ec-ae25-ec05-55b7-0000000004b7 22225 1726882771.47597: variable 'ansible_search_path' from source: unknown 22225 1726882771.47604: variable 'ansible_search_path' from source: unknown 22225 1726882771.47648: calling self._execute() 22225 1726882771.47763: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.47780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.47794: variable 'omit' from source: magic vars 22225 1726882771.48182: variable 'ansible_distribution_major_version' from source: facts 22225 1726882771.48315: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882771.48344: variable 'profile_stat' from source: set_fact 22225 1726882771.48362: Evaluated conditional (profile_stat.stat.exists): False 22225 1726882771.48369: when evaluation is False, skipping this task 22225 1726882771.48376: _execute() done 22225 1726882771.48383: dumping result to json 22225 1726882771.48390: done dumping result, returning 22225 1726882771.48425: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0affc7ec-ae25-ec05-55b7-0000000004b7] 22225 1726882771.48431: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b7 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22225 1726882771.48592: no more pending results, returning what we have 22225 1726882771.48597: results queue empty 22225 1726882771.48599: checking for any_errors_fatal 22225 1726882771.48607: done checking for any_errors_fatal 22225 1726882771.48608: checking for max_fail_percentage 22225 1726882771.48610: done checking for max_fail_percentage 22225 1726882771.48611: checking to see if all hosts have failed and the running result is not ok 22225 1726882771.48612: done checking to see if all hosts have failed 22225 1726882771.48612: getting the remaining hosts for this loop 22225 1726882771.48615: done getting the remaining hosts for this loop 22225 1726882771.48620: getting the next task for host managed_node1 22225 1726882771.48628: done getting next task for host managed_node1 22225 1726882771.48632: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 22225 1726882771.48729: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882771.48736: getting variables 22225 1726882771.48737: in VariableManager get_vars() 22225 1726882771.48783: Calling all_inventory to load vars for managed_node1 22225 1726882771.48786: Calling groups_inventory to load vars for managed_node1 22225 1726882771.48789: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882771.48804: Calling all_plugins_play to load vars for managed_node1 22225 1726882771.48807: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882771.48810: Calling groups_plugins_play to load vars for managed_node1 22225 1726882771.49339: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b7 22225 1726882771.49343: WORKER PROCESS EXITING 22225 1726882771.58300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882771.61566: done with get_vars() 22225 1726882771.61597: done getting variables 22225 1726882771.61677: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882771.61800: variable 'profile' from source: include params 22225 1726882771.61803: variable 'interface' from source: play vars 22225 1726882771.61868: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:39:31 -0400 (0:00:00.151) 0:00:27.011 ****** 22225 1726882771.61899: entering _queue_task() for managed_node1/command 22225 1726882771.62442: worker is 1 (out of 1 available) 22225 1726882771.62527: exiting _queue_task() for managed_node1/command 22225 1726882771.62538: done queuing things up, now waiting for results queue to drain 22225 1726882771.62540: waiting for pending results... 
22225 1726882771.62714: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-veth0 22225 1726882771.62894: in run() - task 0affc7ec-ae25-ec05-55b7-0000000004b8 22225 1726882771.62928: variable 'ansible_search_path' from source: unknown 22225 1726882771.62938: variable 'ansible_search_path' from source: unknown 22225 1726882771.62989: calling self._execute() 22225 1726882771.63140: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.63144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.63201: variable 'omit' from source: magic vars 22225 1726882771.64174: variable 'ansible_distribution_major_version' from source: facts 22225 1726882771.64183: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882771.64510: variable 'profile_stat' from source: set_fact 22225 1726882771.64675: Evaluated conditional (profile_stat.stat.exists): False 22225 1726882771.64701: when evaluation is False, skipping this task 22225 1726882771.64741: _execute() done 22225 1726882771.64751: dumping result to json 22225 1726882771.64836: done dumping result, returning 22225 1726882771.64885: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-veth0 [0affc7ec-ae25-ec05-55b7-0000000004b8] 22225 1726882771.64958: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b8 22225 1726882771.65148: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b8 22225 1726882771.65152: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22225 1726882771.65229: no more pending results, returning what we have 22225 1726882771.65234: results queue empty 22225 1726882771.65235: checking for any_errors_fatal 22225 1726882771.65243: done checking for any_errors_fatal 22225 1726882771.65244: checking for max_fail_percentage 22225 1726882771.65247: done checking for max_fail_percentage 22225 1726882771.65248: checking to see if all hosts have failed and the running result is not ok 22225 1726882771.65249: done checking to see if all hosts have failed 22225 1726882771.65250: getting the remaining hosts for this loop 22225 1726882771.65252: done getting the remaining hosts for this loop 22225 1726882771.65256: getting the next task for host managed_node1 22225 1726882771.65264: done getting next task for host managed_node1 22225 1726882771.65267: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 22225 1726882771.65272: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882771.65276: getting variables 22225 1726882771.65278: in VariableManager get_vars() 22225 1726882771.65544: Calling all_inventory to load vars for managed_node1 22225 1726882771.65547: Calling groups_inventory to load vars for managed_node1 22225 1726882771.65550: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882771.65564: Calling all_plugins_play to load vars for managed_node1 22225 1726882771.65568: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882771.65571: Calling groups_plugins_play to load vars for managed_node1 22225 1726882771.67734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882771.70021: done with get_vars() 22225 1726882771.70054: done getting variables 22225 1726882771.70118: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882771.70245: variable 'profile' from source: include params 22225 1726882771.70249: variable 'interface' from source: play vars 22225 1726882771.70321: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:39:31 -0400 (0:00:00.084) 0:00:27.096 ****** 22225 1726882771.70358: entering _queue_task() for managed_node1/set_fact 22225 1726882771.70717: worker is 1 (out of 1 available) 22225 1726882771.70932: exiting _queue_task() for managed_node1/set_fact 22225 1726882771.70944: done queuing things up, now waiting for results queue to drain 22225 1726882771.70945: waiting for pending results... 
22225 1726882771.71250: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-veth0 22225 1726882771.71258: in run() - task 0affc7ec-ae25-ec05-55b7-0000000004b9 22225 1726882771.71262: variable 'ansible_search_path' from source: unknown 22225 1726882771.71264: variable 'ansible_search_path' from source: unknown 22225 1726882771.71308: calling self._execute() 22225 1726882771.71442: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.71460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.71527: variable 'omit' from source: magic vars 22225 1726882771.71936: variable 'ansible_distribution_major_version' from source: facts 22225 1726882771.71956: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882771.72103: variable 'profile_stat' from source: set_fact 22225 1726882771.72135: Evaluated conditional (profile_stat.stat.exists): False 22225 1726882771.72144: when evaluation is False, skipping this task 22225 1726882771.72152: _execute() done 22225 1726882771.72225: dumping result to json 22225 1726882771.72230: done dumping result, returning 22225 1726882771.72233: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-veth0 [0affc7ec-ae25-ec05-55b7-0000000004b9] 22225 1726882771.72235: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b9 22225 1726882771.72316: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000004b9 22225 1726882771.72319: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22225 1726882771.72371: no more pending results, returning what we have 22225 1726882771.72377: results queue empty 22225 1726882771.72378: checking for any_errors_fatal 22225 1726882771.72387: done checking for any_errors_fatal 22225 1726882771.72388: checking for max_fail_percentage 22225 1726882771.72390: done checking for max_fail_percentage 22225 1726882771.72392: checking to see if all hosts have failed and the running result is not ok 22225 1726882771.72393: done checking to see if all hosts have failed 22225 1726882771.72394: getting the remaining hosts for this loop 22225 1726882771.72396: done getting the remaining hosts for this loop 22225 1726882771.72401: getting the next task for host managed_node1 22225 1726882771.72412: done getting next task for host managed_node1 22225 1726882771.72415: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 22225 1726882771.72418: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882771.72425: getting variables 22225 1726882771.72427: in VariableManager get_vars() 22225 1726882771.72478: Calling all_inventory to load vars for managed_node1 22225 1726882771.72483: Calling groups_inventory to load vars for managed_node1 22225 1726882771.72486: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882771.72501: Calling all_plugins_play to load vars for managed_node1 22225 1726882771.72505: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882771.72508: Calling groups_plugins_play to load vars for managed_node1 22225 1726882771.74593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882771.76715: done with get_vars() 22225 1726882771.76745: done getting variables 22225 1726882771.76815: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882771.76953: variable 'profile' from source: include params 22225 1726882771.76957: variable 'interface' from source: play vars 22225 1726882771.77026: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:39:31 -0400 (0:00:00.066) 0:00:27.163 ****** 22225 1726882771.77059: entering _queue_task() for managed_node1/assert 22225 1726882771.77482: worker is 1 (out of 1 available) 22225 1726882771.77496: exiting _queue_task() for managed_node1/assert 22225 1726882771.77507: done queuing things up, now waiting for results queue to drain 22225 1726882771.77509: waiting for pending results... 
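Annotation: the assert task queued here checks the fact set earlier; the records below show it evaluating lsr_net_profile_exists and reporting "All assertions passed". A short Python equivalent of that check, illustrative only:

flags = {"lsr_net_profile_exists": True}  # as set by the earlier set_fact task

assert flags["lsr_net_profile_exists"], "profile veth0 is not present"
print("All assertions passed")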
22225 1726882771.77961: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'veth0' 22225 1726882771.77967: in run() - task 0affc7ec-ae25-ec05-55b7-0000000003b9 22225 1726882771.77969: variable 'ansible_search_path' from source: unknown 22225 1726882771.77972: variable 'ansible_search_path' from source: unknown 22225 1726882771.77986: calling self._execute() 22225 1726882771.78109: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.78124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.78140: variable 'omit' from source: magic vars 22225 1726882771.78587: variable 'ansible_distribution_major_version' from source: facts 22225 1726882771.78629: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882771.78634: variable 'omit' from source: magic vars 22225 1726882771.78673: variable 'omit' from source: magic vars 22225 1726882771.78814: variable 'profile' from source: include params 22225 1726882771.78818: variable 'interface' from source: play vars 22225 1726882771.78884: variable 'interface' from source: play vars 22225 1726882771.78922: variable 'omit' from source: magic vars 22225 1726882771.79030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882771.79034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882771.79044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882771.79073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882771.79094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882771.79134: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882771.79146: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.79155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.79287: Set connection var ansible_connection to ssh 22225 1726882771.79358: Set connection var ansible_pipelining to False 22225 1726882771.79361: Set connection var ansible_shell_executable to /bin/sh 22225 1726882771.79364: Set connection var ansible_timeout to 10 22225 1726882771.79366: Set connection var ansible_shell_type to sh 22225 1726882771.79369: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882771.79375: variable 'ansible_shell_executable' from source: unknown 22225 1726882771.79387: variable 'ansible_connection' from source: unknown 22225 1726882771.79401: variable 'ansible_module_compression' from source: unknown 22225 1726882771.79409: variable 'ansible_shell_type' from source: unknown 22225 1726882771.79415: variable 'ansible_shell_executable' from source: unknown 22225 1726882771.79421: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.79431: variable 'ansible_pipelining' from source: unknown 22225 1726882771.79437: variable 'ansible_timeout' from source: unknown 22225 1726882771.79466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.79616: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882771.79639: variable 'omit' from source: magic vars 22225 1726882771.79686: starting attempt loop 22225 1726882771.79689: running the handler 22225 1726882771.79785: variable 'lsr_net_profile_exists' from source: set_fact 22225 1726882771.79800: Evaluated conditional (lsr_net_profile_exists): True 22225 1726882771.79809: handler run complete 22225 1726882771.79834: attempt loop complete, returning result 22225 1726882771.79900: _execute() done 22225 1726882771.79904: dumping result to json 22225 1726882771.79907: done dumping result, returning 22225 1726882771.79910: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'veth0' [0affc7ec-ae25-ec05-55b7-0000000003b9] 22225 1726882771.79912: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003b9 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 22225 1726882771.80099: no more pending results, returning what we have 22225 1726882771.80103: results queue empty 22225 1726882771.80104: checking for any_errors_fatal 22225 1726882771.80112: done checking for any_errors_fatal 22225 1726882771.80113: checking for max_fail_percentage 22225 1726882771.80114: done checking for max_fail_percentage 22225 1726882771.80116: checking to see if all hosts have failed and the running result is not ok 22225 1726882771.80116: done checking to see if all hosts have failed 22225 1726882771.80117: getting the remaining hosts for this loop 22225 1726882771.80119: done getting the remaining hosts for this loop 22225 1726882771.80126: getting the next task for host managed_node1 22225 1726882771.80135: done getting next task for host managed_node1 22225 1726882771.80138: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 22225 1726882771.80142: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882771.80147: getting variables 22225 1726882771.80149: in VariableManager get_vars() 22225 1726882771.80203: Calling all_inventory to load vars for managed_node1 22225 1726882771.80206: Calling groups_inventory to load vars for managed_node1 22225 1726882771.80209: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882771.80426: Calling all_plugins_play to load vars for managed_node1 22225 1726882771.80431: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882771.80436: Calling groups_plugins_play to load vars for managed_node1 22225 1726882771.81165: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003b9 22225 1726882771.81169: WORKER PROCESS EXITING 22225 1726882771.82475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882771.84753: done with get_vars() 22225 1726882771.84783: done getting variables 22225 1726882771.84854: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882771.84985: variable 'profile' from source: include params 22225 1726882771.84989: variable 'interface' from source: play vars 22225 1726882771.85059: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:39:31 -0400 (0:00:00.080) 0:00:27.243 ****** 22225 1726882771.85100: entering _queue_task() for managed_node1/assert 22225 1726882771.85492: worker is 1 (out of 1 available) 22225 1726882771.85505: exiting _queue_task() for managed_node1/assert 22225 1726882771.85517: done queuing things up, now waiting for results queue to drain 22225 1726882771.85518: waiting for pending results... 
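The assertion that just passed tests a single boolean fact, `lsr_net_profile_exists`, which the log shows coming from an earlier `set_fact`. A minimal sketch of such an assert task, assuming this is roughly the shape of the task at assert_profile_present.yml:5 (the file itself is not reproduced in the log):

```yaml
# Sketch of a single-condition assert in the style of the passing task above;
# lsr_net_profile_exists is assumed to have been registered by an earlier set_fact.
- name: "Assert that the profile is present - '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is not present"   # message text is an assumption
```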
22225 1726882771.85813: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'veth0' 22225 1726882771.85946: in run() - task 0affc7ec-ae25-ec05-55b7-0000000003ba 22225 1726882771.85966: variable 'ansible_search_path' from source: unknown 22225 1726882771.85974: variable 'ansible_search_path' from source: unknown 22225 1726882771.86028: calling self._execute() 22225 1726882771.86146: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.86158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.86172: variable 'omit' from source: magic vars 22225 1726882771.86595: variable 'ansible_distribution_major_version' from source: facts 22225 1726882771.86612: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882771.86626: variable 'omit' from source: magic vars 22225 1726882771.86682: variable 'omit' from source: magic vars 22225 1726882771.86800: variable 'profile' from source: include params 22225 1726882771.86811: variable 'interface' from source: play vars 22225 1726882771.86897: variable 'interface' from source: play vars 22225 1726882771.86921: variable 'omit' from source: magic vars 22225 1726882771.86990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882771.87026: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882771.87053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882771.87099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882771.87103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882771.87141: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882771.87208: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.87211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.87272: Set connection var ansible_connection to ssh 22225 1726882771.87291: Set connection var ansible_pipelining to False 22225 1726882771.87303: Set connection var ansible_shell_executable to /bin/sh 22225 1726882771.87321: Set connection var ansible_timeout to 10 22225 1726882771.87332: Set connection var ansible_shell_type to sh 22225 1726882771.87342: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882771.87371: variable 'ansible_shell_executable' from source: unknown 22225 1726882771.87381: variable 'ansible_connection' from source: unknown 22225 1726882771.87390: variable 'ansible_module_compression' from source: unknown 22225 1726882771.87424: variable 'ansible_shell_type' from source: unknown 22225 1726882771.87428: variable 'ansible_shell_executable' from source: unknown 22225 1726882771.87431: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.87433: variable 'ansible_pipelining' from source: unknown 22225 1726882771.87436: variable 'ansible_timeout' from source: unknown 22225 1726882771.87438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.87592: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882771.87642: variable 'omit' from source: magic vars 22225 1726882771.87645: starting attempt loop 22225 1726882771.87647: running the handler 22225 1726882771.87759: variable 'lsr_net_profile_ansible_managed' from source: set_fact 22225 1726882771.87769: Evaluated conditional (lsr_net_profile_ansible_managed): True 22225 1726882771.87784: handler run complete 22225 1726882771.87928: attempt loop complete, returning result 22225 1726882771.87931: _execute() done 22225 1726882771.87934: dumping result to json 22225 1726882771.87937: done dumping result, returning 22225 1726882771.87939: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'veth0' [0affc7ec-ae25-ec05-55b7-0000000003ba] 22225 1726882771.87941: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003ba 22225 1726882771.88015: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003ba 22225 1726882771.88018: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 22225 1726882771.88074: no more pending results, returning what we have 22225 1726882771.88082: results queue empty 22225 1726882771.88083: checking for any_errors_fatal 22225 1726882771.88090: done checking for any_errors_fatal 22225 1726882771.88091: checking for max_fail_percentage 22225 1726882771.88093: done checking for max_fail_percentage 22225 1726882771.88094: checking to see if all hosts have failed and the running result is not ok 22225 1726882771.88096: done checking to see if all hosts have failed 22225 1726882771.88096: getting the remaining hosts for this loop 22225 1726882771.88098: done getting the remaining hosts for this loop 22225 1726882771.88103: getting the next task for host managed_node1 22225 1726882771.88111: done getting next task for host managed_node1 22225 1726882771.88114: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 22225 1726882771.88118: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882771.88124: getting variables 22225 1726882771.88126: in VariableManager get_vars() 22225 1726882771.88175: Calling all_inventory to load vars for managed_node1 22225 1726882771.88178: Calling groups_inventory to load vars for managed_node1 22225 1726882771.88184: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882771.88198: Calling all_plugins_play to load vars for managed_node1 22225 1726882771.88201: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882771.88205: Calling groups_plugins_play to load vars for managed_node1 22225 1726882771.90356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882771.92605: done with get_vars() 22225 1726882771.92635: done getting variables 22225 1726882771.92706: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882771.92835: variable 'profile' from source: include params 22225 1726882771.92839: variable 'interface' from source: play vars 22225 1726882771.92937: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:39:31 -0400 (0:00:00.079) 0:00:27.322 ****** 22225 1726882771.92999: entering _queue_task() for managed_node1/assert 22225 1726882771.93586: worker is 1 (out of 1 available) 22225 1726882771.93600: exiting _queue_task() for managed_node1/assert 22225 1726882771.93610: done queuing things up, now waiting for results queue to drain 22225 1726882771.93612: waiting for pending results... 
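Taken together, the three assertions in this block each test one boolean fact: `lsr_net_profile_exists`, `lsr_net_profile_ansible_managed`, and `lsr_net_profile_fingerprint`. A condensed sketch of the combined check (the real tests run them as three separate tasks in assert_profile_present.yml):

```yaml
# Condensed sketch; the variable names come from the log, the single-task layout does not.
- name: Assert the veth0 profile is present, ansible managed, and fingerprinted
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists
      - lsr_net_profile_ansible_managed
      - lsr_net_profile_fingerprint
```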
22225 1726882771.93965: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in veth0 22225 1726882771.93971: in run() - task 0affc7ec-ae25-ec05-55b7-0000000003bb 22225 1726882771.93975: variable 'ansible_search_path' from source: unknown 22225 1726882771.93978: variable 'ansible_search_path' from source: unknown 22225 1726882771.94014: calling self._execute() 22225 1726882771.94139: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.94153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.94177: variable 'omit' from source: magic vars 22225 1726882771.94626: variable 'ansible_distribution_major_version' from source: facts 22225 1726882771.94716: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882771.94721: variable 'omit' from source: magic vars 22225 1726882771.94726: variable 'omit' from source: magic vars 22225 1726882771.94823: variable 'profile' from source: include params 22225 1726882771.94839: variable 'interface' from source: play vars 22225 1726882771.94912: variable 'interface' from source: play vars 22225 1726882771.95028: variable 'omit' from source: magic vars 22225 1726882771.95032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882771.95041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882771.95070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882771.95096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882771.95113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882771.95172: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882771.95183: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.95228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.95316: Set connection var ansible_connection to ssh 22225 1726882771.95353: Set connection var ansible_pipelining to False 22225 1726882771.95376: Set connection var ansible_shell_executable to /bin/sh 22225 1726882771.95391: Set connection var ansible_timeout to 10 22225 1726882771.95399: Set connection var ansible_shell_type to sh 22225 1726882771.95410: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882771.95491: variable 'ansible_shell_executable' from source: unknown 22225 1726882771.95587: variable 'ansible_connection' from source: unknown 22225 1726882771.95592: variable 'ansible_module_compression' from source: unknown 22225 1726882771.95595: variable 'ansible_shell_type' from source: unknown 22225 1726882771.95597: variable 'ansible_shell_executable' from source: unknown 22225 1726882771.95599: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882771.95601: variable 'ansible_pipelining' from source: unknown 22225 1726882771.95604: variable 'ansible_timeout' from source: unknown 22225 1726882771.95606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882771.95747: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882771.95811: variable 'omit' from source: magic vars 22225 1726882771.95814: starting attempt loop 22225 1726882771.95817: running the handler 22225 1726882771.95917: variable 'lsr_net_profile_fingerprint' from source: set_fact 22225 1726882771.95929: Evaluated conditional (lsr_net_profile_fingerprint): True 22225 1726882771.95943: handler run complete 22225 1726882771.95962: attempt loop complete, returning result 22225 1726882771.95968: _execute() done 22225 1726882771.95974: dumping result to json 22225 1726882771.96029: done dumping result, returning 22225 1726882771.96032: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in veth0 [0affc7ec-ae25-ec05-55b7-0000000003bb] 22225 1726882771.96036: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003bb 22225 1726882771.96228: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000003bb 22225 1726882771.96232: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 22225 1726882771.96293: no more pending results, returning what we have 22225 1726882771.96296: results queue empty 22225 1726882771.96297: checking for any_errors_fatal 22225 1726882771.96305: done checking for any_errors_fatal 22225 1726882771.96306: checking for max_fail_percentage 22225 1726882771.96308: done checking for max_fail_percentage 22225 1726882771.96309: checking to see if all hosts have failed and the running result is not ok 22225 1726882771.96310: done checking to see if all hosts have failed 22225 1726882771.96311: getting the remaining hosts for this loop 22225 1726882771.96313: done getting the remaining hosts for this loop 22225 1726882771.96317: getting the next task for host managed_node1 22225 1726882771.96328: done getting next task for host managed_node1 22225 1726882771.96332: ^ task is: TASK: Get ip address information 22225 1726882771.96334: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882771.96338: getting variables 22225 1726882771.96424: in VariableManager get_vars() 22225 1726882771.96471: Calling all_inventory to load vars for managed_node1 22225 1726882771.96475: Calling groups_inventory to load vars for managed_node1 22225 1726882771.96478: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882771.96493: Calling all_plugins_play to load vars for managed_node1 22225 1726882771.96497: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882771.96500: Calling groups_plugins_play to load vars for managed_node1 22225 1726882771.99212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882772.01078: done with get_vars() 22225 1726882772.01098: done getting variables 22225 1726882772.01145: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ip address information] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53 Friday 20 September 2024 21:39:32 -0400 (0:00:00.081) 0:00:27.404 ****** 22225 1726882772.01166: entering _queue_task() for managed_node1/command 22225 1726882772.01462: worker is 1 (out of 1 available) 22225 1726882772.01476: exiting _queue_task() for managed_node1/command 22225 1726882772.01490: done queuing things up, now waiting for results queue to drain 22225 1726882772.01491: waiting for pending results... 22225 1726882772.01869: running TaskExecutor() for managed_node1/TASK: Get ip address information 22225 1726882772.02076: in run() - task 0affc7ec-ae25-ec05-55b7-00000000005e 22225 1726882772.02083: variable 'ansible_search_path' from source: unknown 22225 1726882772.02087: calling self._execute() 22225 1726882772.02191: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.02206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.02221: variable 'omit' from source: magic vars 22225 1726882772.02811: variable 'ansible_distribution_major_version' from source: facts 22225 1726882772.02825: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882772.02835: variable 'omit' from source: magic vars 22225 1726882772.02873: variable 'omit' from source: magic vars 22225 1726882772.02964: variable 'interface' from source: play vars 22225 1726882772.03061: variable 'omit' from source: magic vars 22225 1726882772.03066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882772.03070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882772.03073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882772.03092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.03104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.03141: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 
1726882772.03145: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.03147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.03249: Set connection var ansible_connection to ssh 22225 1726882772.03260: Set connection var ansible_pipelining to False 22225 1726882772.03269: Set connection var ansible_shell_executable to /bin/sh 22225 1726882772.03277: Set connection var ansible_timeout to 10 22225 1726882772.03282: Set connection var ansible_shell_type to sh 22225 1726882772.03285: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882772.03309: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.03312: variable 'ansible_connection' from source: unknown 22225 1726882772.03316: variable 'ansible_module_compression' from source: unknown 22225 1726882772.03319: variable 'ansible_shell_type' from source: unknown 22225 1726882772.03323: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.03326: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.03328: variable 'ansible_pipelining' from source: unknown 22225 1726882772.03331: variable 'ansible_timeout' from source: unknown 22225 1726882772.03390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.03483: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882772.03497: variable 'omit' from source: magic vars 22225 1726882772.03500: starting attempt loop 22225 1726882772.03503: running the handler 22225 1726882772.03520: _low_level_execute_command(): starting 22225 1726882772.03530: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882772.04188: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.04205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.04251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.04254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.04347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.06085: stdout chunk (state=3): >>>/root <<< 22225 1726882772.06197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
22225 1726882772.06241: stderr chunk (state=3): >>><<< 22225 1726882772.06245: stdout chunk (state=3): >>><<< 22225 1726882772.06264: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882772.06276: _low_level_execute_command(): starting 22225 1726882772.06284: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414 `" && echo ansible-tmp-1726882772.0626247-23197-277860409039414="` echo /root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414 `" ) && sleep 0' 22225 1726882772.06755: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.06768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.06811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.08792: stdout chunk (state=3): >>>ansible-tmp-1726882772.0626247-23197-277860409039414=/root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414 <<< 22225 1726882772.09127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882772.09131: stderr chunk (state=3): >>><<< 22225 1726882772.09134: stdout chunk (state=3): >>><<< 22225 1726882772.09137: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882772.0626247-23197-277860409039414=/root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882772.09139: variable 'ansible_module_compression' from source: unknown 22225 1726882772.09142: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882772.09167: variable 'ansible_facts' from source: unknown 22225 1726882772.09262: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/AnsiballZ_command.py 22225 1726882772.09454: Sending initial data 22225 1726882772.09458: Sent initial data (156 bytes) 22225 1726882772.09961: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.09968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882772.09986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.09990: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882772.10000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.10057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.10064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.10116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.11711: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882772.11767: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882772.11836: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpywnvk1x4 /root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/AnsiballZ_command.py <<< 22225 1726882772.11839: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/AnsiballZ_command.py" <<< 22225 1726882772.11886: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpywnvk1x4" to remote "/root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/AnsiballZ_command.py" <<< 22225 1726882772.12694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882772.12735: stderr chunk (state=3): >>><<< 22225 1726882772.12738: stdout chunk (state=3): >>><<< 22225 1726882772.12755: done transferring module to remote 22225 1726882772.12764: _low_level_execute_command(): starting 22225 1726882772.12767: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/ /root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/AnsiballZ_command.py && sleep 0' 22225 1726882772.13293: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882772.13297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882772.13300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882772.13303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.13406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.13409: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 22225 1726882772.13461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.15243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882772.15283: stderr chunk (state=3): >>><<< 22225 1726882772.15292: stdout chunk (state=3): >>><<< 22225 1726882772.15303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882772.15306: _low_level_execute_command(): starting 22225 1726882772.15309: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/AnsiballZ_command.py && sleep 0' 22225 1726882772.15703: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882772.15710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.15734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.15737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.15795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.15801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.15865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.32551: stdout chunk (state=3): >>> {"changed": true, "stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 
e2:4c:08:99:31:59 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::e04c:8ff:fe99:3159/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 21:39:32.319888", "end": "2024-09-20 21:39:32.323805", "delta": "0:00:00.003917", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882772.34070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882772.34137: stderr chunk (state=3): >>><<< 22225 1726882772.34141: stdout chunk (state=3): >>><<< 22225 1726882772.34158: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether e2:4c:08:99:31:59 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::e04c:8ff:fe99:3159/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 21:39:32.319888", "end": "2024-09-20 21:39:32.323805", "delta": "0:00:00.003917", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
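The module invocation above runs `ip addr show veth0` through the command module; the final task result is reported with changed: false even though the raw module output says changed: true, which suggests a `changed_when: false` override, and the captured output is then printed by the "Show ip_addr" debug task that follows. A minimal sketch of that pair of tasks, assuming only the names and register variable seen in the log:

```yaml
# Sketch of the command/debug pair executed here; changed_when: false is inferred
# from the result being reported as unchanged, not read from the playbook.
- name: Get ip address information
  ansible.builtin.command: ip addr show veth0
  register: ip_addr
  changed_when: false

- name: Show ip_addr
  ansible.builtin.debug:
    var: ip_addr.stdout
```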
22225 1726882772.34195: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip addr show veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882772.34203: _low_level_execute_command(): starting 22225 1726882772.34208: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882772.0626247-23197-277860409039414/ > /dev/null 2>&1 && sleep 0' 22225 1726882772.34873: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.34877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.34884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.34891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.34927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.34969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.36886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882772.36955: stderr chunk (state=3): >>><<< 22225 1726882772.36958: stdout chunk (state=3): >>><<< 22225 1726882772.36987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882772.36990: handler run complete 22225 1726882772.37032: Evaluated conditional (False): False 22225 1726882772.37036: attempt loop complete, returning result 22225 1726882772.37038: _execute() done 22225 1726882772.37044: dumping result to json 22225 1726882772.37050: done dumping result, returning 22225 1726882772.37058: done running TaskExecutor() for managed_node1/TASK: Get ip address information [0affc7ec-ae25-ec05-55b7-00000000005e] 22225 1726882772.37069: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000005e 22225 1726882772.37203: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000005e 22225 1726882772.37206: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "addr", "show", "veth0" ], "delta": "0:00:00.003917", "end": "2024-09-20 21:39:32.323805", "rc": 0, "start": "2024-09-20 21:39:32.319888" } STDOUT: 31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000 link/ether e2:4c:08:99:31:59 brd ff:ff:ff:ff:ff:ff link-netns ns1 inet6 2001:db8::2/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::3/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::4/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 fe80::e04c:8ff:fe99:3159/64 scope link noprefixroute valid_lft forever preferred_lft forever 22225 1726882772.37320: no more pending results, returning what we have 22225 1726882772.37326: results queue empty 22225 1726882772.37327: checking for any_errors_fatal 22225 1726882772.37336: done checking for any_errors_fatal 22225 1726882772.37337: checking for max_fail_percentage 22225 1726882772.37339: done checking for max_fail_percentage 22225 1726882772.37341: checking to see if all hosts have failed and the running result is not ok 22225 1726882772.37342: done checking to see if all hosts have failed 22225 1726882772.37343: getting the remaining hosts for this loop 22225 1726882772.37344: done getting the remaining hosts for this loop 22225 1726882772.37349: getting the next task for host managed_node1 22225 1726882772.37354: done getting next task for host managed_node1 22225 1726882772.37357: ^ task is: TASK: Show ip_addr 22225 1726882772.37360: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882772.37364: getting variables 22225 1726882772.37366: in VariableManager get_vars() 22225 1726882772.37407: Calling all_inventory to load vars for managed_node1 22225 1726882772.37410: Calling groups_inventory to load vars for managed_node1 22225 1726882772.37412: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882772.37456: Calling all_plugins_play to load vars for managed_node1 22225 1726882772.37461: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882772.37464: Calling groups_plugins_play to load vars for managed_node1 22225 1726882772.38992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882772.40513: done with get_vars() 22225 1726882772.40532: done getting variables 22225 1726882772.40578: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ip_addr] ************************************************************ task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:57 Friday 20 September 2024 21:39:32 -0400 (0:00:00.394) 0:00:27.798 ****** 22225 1726882772.40624: entering _queue_task() for managed_node1/debug 22225 1726882772.40906: worker is 1 (out of 1 available) 22225 1726882772.40921: exiting _queue_task() for managed_node1/debug 22225 1726882772.40934: done queuing things up, now waiting for results queue to drain 22225 1726882772.40935: waiting for pending results... 22225 1726882772.41165: running TaskExecutor() for managed_node1/TASK: Show ip_addr 22225 1726882772.41235: in run() - task 0affc7ec-ae25-ec05-55b7-00000000005f 22225 1726882772.41254: variable 'ansible_search_path' from source: unknown 22225 1726882772.41301: calling self._execute() 22225 1726882772.41394: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.41398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.41420: variable 'omit' from source: magic vars 22225 1726882772.41795: variable 'ansible_distribution_major_version' from source: facts 22225 1726882772.41799: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882772.41803: variable 'omit' from source: magic vars 22225 1726882772.41806: variable 'omit' from source: magic vars 22225 1726882772.41837: variable 'omit' from source: magic vars 22225 1726882772.41878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882772.41911: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882772.41929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882772.41965: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.41970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.41993: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882772.41999: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 
1726882772.42002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.42092: Set connection var ansible_connection to ssh 22225 1726882772.42097: Set connection var ansible_pipelining to False 22225 1726882772.42099: Set connection var ansible_shell_executable to /bin/sh 22225 1726882772.42102: Set connection var ansible_timeout to 10 22225 1726882772.42105: Set connection var ansible_shell_type to sh 22225 1726882772.42107: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882772.42134: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.42137: variable 'ansible_connection' from source: unknown 22225 1726882772.42140: variable 'ansible_module_compression' from source: unknown 22225 1726882772.42143: variable 'ansible_shell_type' from source: unknown 22225 1726882772.42146: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.42148: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.42150: variable 'ansible_pipelining' from source: unknown 22225 1726882772.42153: variable 'ansible_timeout' from source: unknown 22225 1726882772.42158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.42294: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882772.42314: variable 'omit' from source: magic vars 22225 1726882772.42318: starting attempt loop 22225 1726882772.42327: running the handler 22225 1726882772.42449: variable 'ip_addr' from source: set_fact 22225 1726882772.42506: handler run complete 22225 1726882772.42514: attempt loop complete, returning result 22225 1726882772.42517: _execute() done 22225 1726882772.42519: dumping result to json 22225 1726882772.42526: done dumping result, returning 22225 1726882772.42534: done running TaskExecutor() for managed_node1/TASK: Show ip_addr [0affc7ec-ae25-ec05-55b7-00000000005f] 22225 1726882772.42536: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000005f 22225 1726882772.42615: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000005f 22225 1726882772.42618: WORKER PROCESS EXITING ok: [managed_node1] => { "ip_addr.stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether e2:4c:08:99:31:59 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::e04c:8ff:fe99:3159/64 scope link noprefixroute \n valid_lft forever preferred_lft forever" } 22225 1726882772.42690: no more pending results, returning what we have 22225 1726882772.42693: results queue empty 22225 1726882772.42695: checking for any_errors_fatal 22225 1726882772.42701: done checking for any_errors_fatal 22225 1726882772.42702: checking for max_fail_percentage 22225 1726882772.42704: done checking for max_fail_percentage 22225 1726882772.42705: checking to see if all hosts have failed and the running result is not ok 22225 1726882772.42706: done checking to see if all hosts have failed 22225 1726882772.42706: getting the remaining 
hosts for this loop 22225 1726882772.42708: done getting the remaining hosts for this loop 22225 1726882772.42712: getting the next task for host managed_node1 22225 1726882772.42717: done getting next task for host managed_node1 22225 1726882772.42720: ^ task is: TASK: Assert ipv6 addresses are correctly set 22225 1726882772.42744: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882772.42748: getting variables 22225 1726882772.42753: in VariableManager get_vars() 22225 1726882772.42787: Calling all_inventory to load vars for managed_node1 22225 1726882772.42790: Calling groups_inventory to load vars for managed_node1 22225 1726882772.42792: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882772.42801: Calling all_plugins_play to load vars for managed_node1 22225 1726882772.42804: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882772.42806: Calling groups_plugins_play to load vars for managed_node1 22225 1726882772.44067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882772.45266: done with get_vars() 22225 1726882772.45291: done getting variables 22225 1726882772.45337: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert ipv6 addresses are correctly set] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:60 Friday 20 September 2024 21:39:32 -0400 (0:00:00.047) 0:00:27.846 ****** 22225 1726882772.45357: entering _queue_task() for managed_node1/assert 22225 1726882772.45610: worker is 1 (out of 1 available) 22225 1726882772.45627: exiting _queue_task() for managed_node1/assert 22225 1726882772.45639: done queuing things up, now waiting for results queue to drain 22225 1726882772.45641: waiting for pending results... 
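For readers following this trace without the playbook open: the three tasks executed above and queued here (the ip command task, then Show ip_addr at tests_ipv6.yml:57 and the assert at tests_ipv6.yml:60) can be reconstructed from the cmd array, the registered ip_addr variable, and the evaluated conditionals recorded in this log. The sketch below is inferred from those log lines, not copied from the playbook source; the register name, the changed_when, and the literal device name are assumptions (the real task probably templates the device via the interface play var seen later in this log).

- name: Get ip address information
  command: ip addr show veth0    # rendered argv from the log; device name likely templated in the playbook
  register: ip_addr              # inferred from the ip_addr variable the next task prints
  changed_when: false            # inferred from 'Evaluated conditional (False): False' and "changed": false

- name: Show ip_addr
  debug:
    var: ip_addr.stdout

- name: Assert ipv6 addresses are correctly set
  assert:
    that:
      - "'inet6 2001:db8::2/32' in ip_addr.stdout"
      - "'inet6 2001:db8::3/32' in ip_addr.stdout"
      - "'inet6 2001:db8::4/32' in ip_addr.stdout"

The assert conditions are exactly the conditionals the handler evaluates a few records below, so a change in the addresses assigned to veth0 shows up directly as a failed assertion here.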
22225 1726882772.45858: running TaskExecutor() for managed_node1/TASK: Assert ipv6 addresses are correctly set 22225 1726882772.45942: in run() - task 0affc7ec-ae25-ec05-55b7-000000000060 22225 1726882772.45956: variable 'ansible_search_path' from source: unknown 22225 1726882772.45991: calling self._execute() 22225 1726882772.46070: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.46100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.46106: variable 'omit' from source: magic vars 22225 1726882772.46476: variable 'ansible_distribution_major_version' from source: facts 22225 1726882772.46485: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882772.46489: variable 'omit' from source: magic vars 22225 1726882772.46505: variable 'omit' from source: magic vars 22225 1726882772.46539: variable 'omit' from source: magic vars 22225 1726882772.46574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882772.46604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882772.46620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882772.46640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.46649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.46692: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882772.46695: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.46698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.46764: Set connection var ansible_connection to ssh 22225 1726882772.46773: Set connection var ansible_pipelining to False 22225 1726882772.46785: Set connection var ansible_shell_executable to /bin/sh 22225 1726882772.46788: Set connection var ansible_timeout to 10 22225 1726882772.46790: Set connection var ansible_shell_type to sh 22225 1726882772.46795: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882772.46814: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.46817: variable 'ansible_connection' from source: unknown 22225 1726882772.46819: variable 'ansible_module_compression' from source: unknown 22225 1726882772.46824: variable 'ansible_shell_type' from source: unknown 22225 1726882772.46827: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.46829: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.46834: variable 'ansible_pipelining' from source: unknown 22225 1726882772.46837: variable 'ansible_timeout' from source: unknown 22225 1726882772.46843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.46998: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882772.47029: variable 'omit' from source: magic vars 22225 1726882772.47034: starting attempt loop 22225 1726882772.47038: 
running the handler 22225 1726882772.47189: variable 'ip_addr' from source: set_fact 22225 1726882772.47197: Evaluated conditional ('inet6 2001:db8::2/32' in ip_addr.stdout): True 22225 1726882772.47343: variable 'ip_addr' from source: set_fact 22225 1726882772.47350: Evaluated conditional ('inet6 2001:db8::3/32' in ip_addr.stdout): True 22225 1726882772.47471: variable 'ip_addr' from source: set_fact 22225 1726882772.47474: Evaluated conditional ('inet6 2001:db8::4/32' in ip_addr.stdout): True 22225 1726882772.47483: handler run complete 22225 1726882772.47494: attempt loop complete, returning result 22225 1726882772.47548: _execute() done 22225 1726882772.47553: dumping result to json 22225 1726882772.47556: done dumping result, returning 22225 1726882772.47559: done running TaskExecutor() for managed_node1/TASK: Assert ipv6 addresses are correctly set [0affc7ec-ae25-ec05-55b7-000000000060] 22225 1726882772.47561: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000060 22225 1726882772.47631: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000060 22225 1726882772.47636: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 22225 1726882772.47699: no more pending results, returning what we have 22225 1726882772.47703: results queue empty 22225 1726882772.47703: checking for any_errors_fatal 22225 1726882772.47708: done checking for any_errors_fatal 22225 1726882772.47709: checking for max_fail_percentage 22225 1726882772.47711: done checking for max_fail_percentage 22225 1726882772.47712: checking to see if all hosts have failed and the running result is not ok 22225 1726882772.47713: done checking to see if all hosts have failed 22225 1726882772.47713: getting the remaining hosts for this loop 22225 1726882772.47715: done getting the remaining hosts for this loop 22225 1726882772.47719: getting the next task for host managed_node1 22225 1726882772.47726: done getting next task for host managed_node1 22225 1726882772.47728: ^ task is: TASK: Get ipv6 routes 22225 1726882772.47730: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882772.47732: getting variables 22225 1726882772.47734: in VariableManager get_vars() 22225 1726882772.47770: Calling all_inventory to load vars for managed_node1 22225 1726882772.47773: Calling groups_inventory to load vars for managed_node1 22225 1726882772.47775: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882772.47787: Calling all_plugins_play to load vars for managed_node1 22225 1726882772.47789: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882772.47792: Calling groups_plugins_play to load vars for managed_node1 22225 1726882772.49092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882772.50689: done with get_vars() 22225 1726882772.50723: done getting variables 22225 1726882772.50809: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69 Friday 20 September 2024 21:39:32 -0400 (0:00:00.054) 0:00:27.901 ****** 22225 1726882772.50846: entering _queue_task() for managed_node1/command 22225 1726882772.51158: worker is 1 (out of 1 available) 22225 1726882772.51173: exiting _queue_task() for managed_node1/command 22225 1726882772.51188: done queuing things up, now waiting for results queue to drain 22225 1726882772.51190: waiting for pending results... 22225 1726882772.51606: running TaskExecutor() for managed_node1/TASK: Get ipv6 routes 22225 1726882772.51611: in run() - task 0affc7ec-ae25-ec05-55b7-000000000061 22225 1726882772.51620: variable 'ansible_search_path' from source: unknown 22225 1726882772.51650: calling self._execute() 22225 1726882772.51736: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.51740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.51757: variable 'omit' from source: magic vars 22225 1726882772.52054: variable 'ansible_distribution_major_version' from source: facts 22225 1726882772.52063: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882772.52070: variable 'omit' from source: magic vars 22225 1726882772.52092: variable 'omit' from source: magic vars 22225 1726882772.52117: variable 'omit' from source: magic vars 22225 1726882772.52153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882772.52188: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882772.52206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882772.52221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.52233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.52258: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882772.52262: variable 'ansible_host' from source: host vars for 
'managed_node1' 22225 1726882772.52265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.52342: Set connection var ansible_connection to ssh 22225 1726882772.52350: Set connection var ansible_pipelining to False 22225 1726882772.52358: Set connection var ansible_shell_executable to /bin/sh 22225 1726882772.52363: Set connection var ansible_timeout to 10 22225 1726882772.52366: Set connection var ansible_shell_type to sh 22225 1726882772.52371: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882772.52393: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.52396: variable 'ansible_connection' from source: unknown 22225 1726882772.52399: variable 'ansible_module_compression' from source: unknown 22225 1726882772.52402: variable 'ansible_shell_type' from source: unknown 22225 1726882772.52405: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.52407: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.52411: variable 'ansible_pipelining' from source: unknown 22225 1726882772.52414: variable 'ansible_timeout' from source: unknown 22225 1726882772.52416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.52718: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882772.52724: variable 'omit' from source: magic vars 22225 1726882772.52727: starting attempt loop 22225 1726882772.52729: running the handler 22225 1726882772.52732: _low_level_execute_command(): starting 22225 1726882772.52734: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882772.53393: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882772.53404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882772.53415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.53437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882772.53451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882772.53460: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882772.53514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.53517: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882772.53631: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.53637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 
1726882772.53677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.55354: stdout chunk (state=3): >>>/root <<< 22225 1726882772.55534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882772.55540: stdout chunk (state=3): >>><<< 22225 1726882772.55630: stderr chunk (state=3): >>><<< 22225 1726882772.55649: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882772.55663: _low_level_execute_command(): starting 22225 1726882772.55669: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194 `" && echo ansible-tmp-1726882772.556484-23217-130656735208194="` echo /root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194 `" ) && sleep 0' 22225 1726882772.57064: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882772.57073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882772.57087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.57102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882772.57112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882772.57209: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.57446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.57521: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.59482: stdout chunk (state=3): >>>ansible-tmp-1726882772.556484-23217-130656735208194=/root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194 <<< 22225 1726882772.59741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882772.59744: stdout chunk (state=3): >>><<< 22225 1726882772.59747: stderr chunk (state=3): >>><<< 22225 1726882772.59749: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882772.556484-23217-130656735208194=/root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882772.59751: variable 'ansible_module_compression' from source: unknown 22225 1726882772.59775: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882772.59818: variable 'ansible_facts' from source: unknown 22225 1726882772.59895: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/AnsiballZ_command.py 22225 1726882772.60121: Sending initial data 22225 1726882772.60145: Sent initial data (155 bytes) 22225 1726882772.60929: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882772.60933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882772.60935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.60939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.60942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.61044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.61120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.62671: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22225 1726882772.62684: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 22225 1726882772.62695: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 22225 1726882772.62704: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 22225 1726882772.62720: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882772.62787: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882772.62861: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp8_64bxys /root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/AnsiballZ_command.py <<< 22225 1726882772.62873: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/AnsiballZ_command.py" <<< 22225 1726882772.62932: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp8_64bxys" to remote "/root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/AnsiballZ_command.py" <<< 22225 1726882772.63793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882772.63861: stderr chunk (state=3): >>><<< 22225 1726882772.63875: stdout chunk (state=3): >>><<< 22225 1726882772.63992: done transferring module to remote 22225 1726882772.63998: _low_level_execute_command(): starting 22225 1726882772.64001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/ /root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/AnsiballZ_command.py && sleep 0' 22225 1726882772.64514: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882772.64552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882772.64568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.64640: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.64688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882772.64710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.64744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.64831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.66654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882772.66728: stderr chunk (state=3): >>><<< 22225 1726882772.66763: stdout chunk (state=3): >>><<< 22225 1726882772.66773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882772.66871: _low_level_execute_command(): starting 22225 1726882772.66876: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/AnsiballZ_command.py && sleep 0' 22225 1726882772.67460: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882772.67475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882772.67489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.67537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.67612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882772.67634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.67663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.67752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.84405: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:39:32.838569", "end": "2024-09-20 21:39:32.842306", "delta": "0:00:00.003737", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882772.85978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882772.86029: stderr chunk (state=3): >>><<< 22225 1726882772.86035: stdout chunk (state=3): >>><<< 22225 1726882772.86055: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:39:32.838569", "end": "2024-09-20 21:39:32.842306", "delta": "0:00:00.003737", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882772.86085: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882772.86095: _low_level_execute_command(): starting 22225 1726882772.86100: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882772.556484-23217-130656735208194/ > /dev/null 2>&1 && sleep 0' 22225 1726882772.86547: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882772.86551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.86558: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882772.86560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882772.86562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882772.86609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882772.86616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882772.86670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882772.88542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882772.88584: stderr chunk (state=3): >>><<< 22225 1726882772.88587: stdout chunk (state=3): >>><<< 22225 1726882772.88598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882772.88604: handler run complete 22225 1726882772.88624: Evaluated conditional (False): False 22225 1726882772.88634: attempt loop complete, returning result 22225 1726882772.88638: _execute() done 22225 1726882772.88641: dumping result to json 22225 1726882772.88647: done dumping result, returning 22225 1726882772.88654: done running TaskExecutor() for managed_node1/TASK: Get ipv6 routes [0affc7ec-ae25-ec05-55b7-000000000061] 22225 1726882772.88663: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000061 22225 1726882772.88764: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000061 22225 1726882772.88767: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003737", "end": "2024-09-20 21:39:32.842306", "rc": 0, "start": "2024-09-20 21:39:32.838569" } STDOUT: 2001:db8::/32 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 22225 1726882772.88852: no more pending results, returning what we have 22225 1726882772.88855: results queue empty 22225 1726882772.88856: checking for any_errors_fatal 22225 1726882772.88865: done checking for any_errors_fatal 22225 1726882772.88865: checking for max_fail_percentage 22225 1726882772.88867: done checking for max_fail_percentage 22225 1726882772.88868: checking to see if all hosts have failed and the running result is not ok 22225 1726882772.88869: done checking to see if all hosts have failed 22225 1726882772.88870: getting the remaining hosts for this loop 22225 1726882772.88871: done getting the remaining hosts for this loop 22225 1726882772.88885: getting the next task for host managed_node1 22225 1726882772.88891: done getting next task for host managed_node1 22225 1726882772.88894: ^ task is: TASK: Show ipv6_route 22225 1726882772.88896: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882772.88899: getting variables 22225 1726882772.88900: in VariableManager get_vars() 22225 1726882772.88939: Calling all_inventory to load vars for managed_node1 22225 1726882772.88942: Calling groups_inventory to load vars for managed_node1 22225 1726882772.88944: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882772.88955: Calling all_plugins_play to load vars for managed_node1 22225 1726882772.88958: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882772.88960: Calling groups_plugins_play to load vars for managed_node1 22225 1726882772.90584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882772.92031: done with get_vars() 22225 1726882772.92053: done getting variables 22225 1726882772.92102: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show ipv6_route] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:73 Friday 20 September 2024 21:39:32 -0400 (0:00:00.412) 0:00:28.313 ****** 22225 1726882772.92125: entering _queue_task() for managed_node1/debug 22225 1726882772.92378: worker is 1 (out of 1 available) 22225 1726882772.92396: exiting _queue_task() for managed_node1/debug 22225 1726882772.92407: done queuing things up, now waiting for results queue to drain 22225 1726882772.92409: waiting for pending results... 22225 1726882772.92594: running TaskExecutor() for managed_node1/TASK: Show ipv6_route 22225 1726882772.92662: in run() - task 0affc7ec-ae25-ec05-55b7-000000000062 22225 1726882772.92675: variable 'ansible_search_path' from source: unknown 22225 1726882772.92708: calling self._execute() 22225 1726882772.92789: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.92793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.92803: variable 'omit' from source: magic vars 22225 1726882772.93106: variable 'ansible_distribution_major_version' from source: facts 22225 1726882772.93328: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882772.93333: variable 'omit' from source: magic vars 22225 1726882772.93336: variable 'omit' from source: magic vars 22225 1726882772.93338: variable 'omit' from source: magic vars 22225 1726882772.93340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882772.93343: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882772.93345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882772.93347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.93349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.93402: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882772.93412: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 
1726882772.93421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.93545: Set connection var ansible_connection to ssh 22225 1726882772.93561: Set connection var ansible_pipelining to False 22225 1726882772.93573: Set connection var ansible_shell_executable to /bin/sh 22225 1726882772.93583: Set connection var ansible_timeout to 10 22225 1726882772.93594: Set connection var ansible_shell_type to sh 22225 1726882772.93605: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882772.93638: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.93647: variable 'ansible_connection' from source: unknown 22225 1726882772.93654: variable 'ansible_module_compression' from source: unknown 22225 1726882772.93661: variable 'ansible_shell_type' from source: unknown 22225 1726882772.93667: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.93673: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.93680: variable 'ansible_pipelining' from source: unknown 22225 1726882772.93686: variable 'ansible_timeout' from source: unknown 22225 1726882772.93696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.93853: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882772.93873: variable 'omit' from source: magic vars 22225 1726882772.93886: starting attempt loop 22225 1726882772.93896: running the handler 22225 1726882772.94080: variable 'ipv6_route' from source: set_fact 22225 1726882772.94115: handler run complete 22225 1726882772.94171: attempt loop complete, returning result 22225 1726882772.94174: _execute() done 22225 1726882772.94177: dumping result to json 22225 1726882772.94188: done dumping result, returning 22225 1726882772.94192: done running TaskExecutor() for managed_node1/TASK: Show ipv6_route [0affc7ec-ae25-ec05-55b7-000000000062] 22225 1726882772.94194: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000062 22225 1726882772.94264: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000062 22225 1726882772.94267: WORKER PROCESS EXITING ok: [managed_node1] => { "ipv6_route.stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium" } 22225 1726882772.94315: no more pending results, returning what we have 22225 1726882772.94318: results queue empty 22225 1726882772.94319: checking for any_errors_fatal 22225 1726882772.94331: done checking for any_errors_fatal 22225 1726882772.94332: checking for max_fail_percentage 22225 1726882772.94334: done checking for max_fail_percentage 22225 1726882772.94335: checking to see if all hosts have failed and the running result is not ok 22225 1726882772.94335: done checking to see if all hosts have failed 22225 1726882772.94336: getting the remaining hosts for this loop 22225 1726882772.94338: done getting the remaining hosts for this loop 22225 1726882772.94342: getting the next task for host managed_node1 22225 1726882772.94348: done getting next task for host managed_node1 22225 1726882772.94351: ^ task is: 
TASK: Assert default ipv6 route is set 22225 1726882772.94353: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882772.94356: getting variables 22225 1726882772.94358: in VariableManager get_vars() 22225 1726882772.94397: Calling all_inventory to load vars for managed_node1 22225 1726882772.94400: Calling groups_inventory to load vars for managed_node1 22225 1726882772.94402: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882772.94414: Calling all_plugins_play to load vars for managed_node1 22225 1726882772.94416: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882772.94419: Calling groups_plugins_play to load vars for managed_node1 22225 1726882772.95476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882772.96630: done with get_vars() 22225 1726882772.96647: done getting variables 22225 1726882772.96692: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is set] **************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:76 Friday 20 September 2024 21:39:32 -0400 (0:00:00.045) 0:00:28.359 ****** 22225 1726882772.96714: entering _queue_task() for managed_node1/assert 22225 1726882772.96950: worker is 1 (out of 1 available) 22225 1726882772.96965: exiting _queue_task() for managed_node1/assert 22225 1726882772.96976: done queuing things up, now waiting for results queue to drain 22225 1726882772.96977: waiting for pending results... 
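The same command/debug/assert pattern repeats for routing: Get ipv6 routes (tests_ipv6.yml:69) runs ip -6 route with changed_when false, Show ipv6_route (:73) prints the registered output, and Assert default ipv6 route is set (:76), executed just below, checks a task-level __test_str against it. A sketch inferred from this log follows; note that the actual __test_str value is never echoed in the log (only that it is a task var, that the interface play var is resolved while templating it, and that it occurs in ipv6_route.stdout), so the value shown is a hypothetical example consistent with the route table printed above.

- name: Get ipv6 routes
  command: ip -6 route
  register: ipv6_route           # inferred from the ipv6_route variable used below
  changed_when: false

- name: Show ipv6_route
  debug:
    var: ipv6_route.stdout

- name: Assert default ipv6 route is set
  vars:
    # hypothetical value; the real __test_str is not printed in this log
    __test_str: "default via 2001:db8::1 dev {{ interface }}"
  assert:
    that:
      - __test_str in ipv6_route.stdout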
22225 1726882772.97160: running TaskExecutor() for managed_node1/TASK: Assert default ipv6 route is set 22225 1726882772.97226: in run() - task 0affc7ec-ae25-ec05-55b7-000000000063 22225 1726882772.97241: variable 'ansible_search_path' from source: unknown 22225 1726882772.97270: calling self._execute() 22225 1726882772.97353: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.97359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.97369: variable 'omit' from source: magic vars 22225 1726882772.97662: variable 'ansible_distribution_major_version' from source: facts 22225 1726882772.97673: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882772.97679: variable 'omit' from source: magic vars 22225 1726882772.97698: variable 'omit' from source: magic vars 22225 1726882772.97724: variable 'omit' from source: magic vars 22225 1726882772.97761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882772.97794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882772.97812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882772.97828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.97837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882772.97866: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882772.97869: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.97871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.97944: Set connection var ansible_connection to ssh 22225 1726882772.97953: Set connection var ansible_pipelining to False 22225 1726882772.97961: Set connection var ansible_shell_executable to /bin/sh 22225 1726882772.97970: Set connection var ansible_timeout to 10 22225 1726882772.97973: Set connection var ansible_shell_type to sh 22225 1726882772.97975: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882772.97999: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.98002: variable 'ansible_connection' from source: unknown 22225 1726882772.98005: variable 'ansible_module_compression' from source: unknown 22225 1726882772.98008: variable 'ansible_shell_type' from source: unknown 22225 1726882772.98010: variable 'ansible_shell_executable' from source: unknown 22225 1726882772.98013: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882772.98015: variable 'ansible_pipelining' from source: unknown 22225 1726882772.98018: variable 'ansible_timeout' from source: unknown 22225 1726882772.98024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882772.98137: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882772.98147: variable 'omit' from source: magic vars 22225 1726882772.98153: starting attempt loop 22225 1726882772.98156: running 
the handler 22225 1726882772.98264: variable '__test_str' from source: task vars 22225 1726882772.98319: variable 'interface' from source: play vars 22225 1726882772.98328: variable 'ipv6_route' from source: set_fact 22225 1726882772.98338: Evaluated conditional (__test_str in ipv6_route.stdout): True 22225 1726882772.98344: handler run complete 22225 1726882772.98356: attempt loop complete, returning result 22225 1726882772.98359: _execute() done 22225 1726882772.98362: dumping result to json 22225 1726882772.98365: done dumping result, returning 22225 1726882772.98371: done running TaskExecutor() for managed_node1/TASK: Assert default ipv6 route is set [0affc7ec-ae25-ec05-55b7-000000000063] 22225 1726882772.98376: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000063 22225 1726882772.98467: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000063 22225 1726882772.98470: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 22225 1726882772.98518: no more pending results, returning what we have 22225 1726882772.98521: results queue empty 22225 1726882772.98524: checking for any_errors_fatal 22225 1726882772.98533: done checking for any_errors_fatal 22225 1726882772.98534: checking for max_fail_percentage 22225 1726882772.98536: done checking for max_fail_percentage 22225 1726882772.98537: checking to see if all hosts have failed and the running result is not ok 22225 1726882772.98538: done checking to see if all hosts have failed 22225 1726882772.98538: getting the remaining hosts for this loop 22225 1726882772.98540: done getting the remaining hosts for this loop 22225 1726882772.98544: getting the next task for host managed_node1 22225 1726882772.98549: done getting next task for host managed_node1 22225 1726882772.98552: ^ task is: TASK: Ensure ping6 command is present 22225 1726882772.98554: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882772.98557: getting variables 22225 1726882772.98558: in VariableManager get_vars() 22225 1726882772.98594: Calling all_inventory to load vars for managed_node1 22225 1726882772.98597: Calling groups_inventory to load vars for managed_node1 22225 1726882772.98599: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882772.98610: Calling all_plugins_play to load vars for managed_node1 22225 1726882772.98613: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882772.98616: Calling groups_plugins_play to load vars for managed_node1 22225 1726882772.99562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882773.00719: done with get_vars() 22225 1726882773.00739: done getting variables 22225 1726882773.00784: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure ping6 command is present] ***************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Friday 20 September 2024 21:39:33 -0400 (0:00:00.040) 0:00:28.400 ****** 22225 1726882773.00810: entering _queue_task() for managed_node1/package 22225 1726882773.01059: worker is 1 (out of 1 available) 22225 1726882773.01076: exiting _queue_task() for managed_node1/package 22225 1726882773.01088: done queuing things up, now waiting for results queue to drain 22225 1726882773.01090: waiting for pending results... 22225 1726882773.01274: running TaskExecutor() for managed_node1/TASK: Ensure ping6 command is present 22225 1726882773.01339: in run() - task 0affc7ec-ae25-ec05-55b7-000000000064 22225 1726882773.01351: variable 'ansible_search_path' from source: unknown 22225 1726882773.01381: calling self._execute() 22225 1726882773.01464: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882773.01469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882773.01479: variable 'omit' from source: magic vars 22225 1726882773.01779: variable 'ansible_distribution_major_version' from source: facts 22225 1726882773.01792: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882773.01798: variable 'omit' from source: magic vars 22225 1726882773.01815: variable 'omit' from source: magic vars 22225 1726882773.01975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882773.03730: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882773.03776: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882773.03804: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882773.03844: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882773.03865: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882773.03943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882773.03965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882773.03985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882773.04012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882773.04025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882773.04100: variable '__network_is_ostree' from source: set_fact 22225 1726882773.04104: variable 'omit' from source: magic vars 22225 1726882773.04126: variable 'omit' from source: magic vars 22225 1726882773.04151: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882773.04172: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882773.04186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882773.04200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882773.04209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882773.04236: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882773.04239: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882773.04244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882773.04314: Set connection var ansible_connection to ssh 22225 1726882773.04324: Set connection var ansible_pipelining to False 22225 1726882773.04331: Set connection var ansible_shell_executable to /bin/sh 22225 1726882773.04337: Set connection var ansible_timeout to 10 22225 1726882773.04340: Set connection var ansible_shell_type to sh 22225 1726882773.04345: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882773.04369: variable 'ansible_shell_executable' from source: unknown 22225 1726882773.04372: variable 'ansible_connection' from source: unknown 22225 1726882773.04375: variable 'ansible_module_compression' from source: unknown 22225 1726882773.04382: variable 'ansible_shell_type' from source: unknown 22225 1726882773.04385: variable 'ansible_shell_executable' from source: unknown 22225 1726882773.04387: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882773.04389: variable 'ansible_pipelining' from source: unknown 22225 1726882773.04391: variable 'ansible_timeout' from source: unknown 22225 1726882773.04393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882773.04467: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882773.04477: variable 'omit' from source: magic vars 22225 1726882773.04483: starting attempt loop 22225 1726882773.04486: running the handler 22225 1726882773.04492: variable 'ansible_facts' from source: unknown 22225 1726882773.04494: variable 'ansible_facts' from source: unknown 22225 1726882773.04526: _low_level_execute_command(): starting 22225 1726882773.04532: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882773.05063: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882773.05068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882773.05070: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882773.05072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882773.05135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882773.05145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882773.05147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882773.05197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882773.06921: stdout chunk (state=3): >>>/root <<< 22225 1726882773.07032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882773.07089: stderr chunk (state=3): >>><<< 22225 1726882773.07092: stdout chunk (state=3): >>><<< 22225 1726882773.07110: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882773.07121: _low_level_execute_command(): starting 22225 1726882773.07130: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498 `" && echo ansible-tmp-1726882773.0710938-23245-253521902978498="` echo /root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498 `" ) && sleep 0' 22225 1726882773.07598: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882773.07602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882773.07604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882773.07606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882773.07610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882773.07651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882773.07669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882773.07725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882773.09710: stdout chunk (state=3): >>>ansible-tmp-1726882773.0710938-23245-253521902978498=/root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498 <<< 22225 1726882773.09827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882773.09875: stderr chunk (state=3): >>><<< 22225 1726882773.09879: stdout chunk (state=3): >>><<< 22225 1726882773.09894: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882773.0710938-23245-253521902978498=/root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882773.09919: variable 'ansible_module_compression' from source: unknown 22225 1726882773.09965: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 22225 1726882773.10006: variable 'ansible_facts' from source: unknown 22225 1726882773.10088: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/AnsiballZ_dnf.py 22225 1726882773.10186: Sending initial data 22225 1726882773.10189: Sent initial data (152 bytes) 22225 1726882773.10651: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882773.10654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882773.10656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882773.10659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882773.10661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882773.10711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882773.10715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882773.10770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882773.12347: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 22225 1726882773.12351: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882773.12401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882773.12458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpe_2rgoaj /root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/AnsiballZ_dnf.py <<< 22225 1726882773.12461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/AnsiballZ_dnf.py" <<< 22225 1726882773.12512: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpe_2rgoaj" to remote "/root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/AnsiballZ_dnf.py" <<< 22225 1726882773.13475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882773.13609: stderr chunk (state=3): >>><<< 22225 1726882773.13612: stdout chunk (state=3): >>><<< 22225 1726882773.13615: done transferring module to remote 22225 1726882773.13648: _low_level_execute_command(): starting 22225 1726882773.13655: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/ /root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/AnsiballZ_dnf.py && sleep 0' 22225 1726882773.14120: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882773.14126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882773.14129: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882773.14131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882773.14181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882773.14190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882773.14246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882773.16151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882773.16204: stderr chunk (state=3): >>><<< 22225 1726882773.16213: stdout chunk (state=3): >>><<< 22225 1726882773.16236: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882773.16251: _low_level_execute_command(): starting 22225 1726882773.16260: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/AnsiballZ_dnf.py && sleep 0' 22225 1726882773.16875: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882773.16992: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882773.17011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882773.17032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882773.17118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882774.23973: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 22225 1726882774.28500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882774.28632: stderr chunk (state=3): >>><<< 22225 1726882774.28639: stdout chunk (state=3): >>><<< 22225 1726882774.28689: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
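For readability: the invocation logged above, where the 'package' action plugin dispatches ansible.legacy.dnf with name: ["iputils"] and state: "present" and the module replies "Nothing to do", corresponds to a task of roughly the following shape. This is a sketch inferred from the logged module_args, not the verbatim task at tests_ipv6.yml:81, and the fully qualified ansible.builtin.package name is an assumption based on the 'package' action plugin being loaded for this task.

# Sketch reconstructed from the logged module_args; not the verbatim test task.
- name: Ensure ping6 command is present
  ansible.builtin.package:   # assumed; the log only shows the 'package' action plugin
    name: iputils            # logged module_args: "name": ["iputils"]
    state: present           # logged module_args: "state": "present"

On a dnf-based host the generic package module resolves to the dnf backend, which is why the log shows AnsiballZ_dnf.py being built, transferred over the existing SSH control connection, and executed remotely.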
22225 1726882774.28742: done with _execute_module (ansible.legacy.dnf, {'name': 'iputils', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882774.28760: _low_level_execute_command(): starting 22225 1726882774.28764: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882773.0710938-23245-253521902978498/ > /dev/null 2>&1 && sleep 0' 22225 1726882774.29445: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882774.29449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882774.29490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882774.29493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882774.29543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882774.31469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882774.31514: stderr chunk (state=3): >>><<< 22225 1726882774.31517: stdout chunk (state=3): >>><<< 22225 1726882774.31531: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882774.31538: handler run complete 22225 1726882774.31565: attempt loop complete, returning result 22225 1726882774.31568: _execute() done 22225 1726882774.31570: dumping result to json 22225 1726882774.31576: done dumping result, returning 22225 1726882774.31584: done running TaskExecutor() for managed_node1/TASK: Ensure ping6 command is present [0affc7ec-ae25-ec05-55b7-000000000064] 22225 1726882774.31589: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000064 22225 1726882774.31693: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000064 22225 1726882774.31696: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 22225 1726882774.31771: no more pending results, returning what we have 22225 1726882774.31774: results queue empty 22225 1726882774.31775: checking for any_errors_fatal 22225 1726882774.31784: done checking for any_errors_fatal 22225 1726882774.31785: checking for max_fail_percentage 22225 1726882774.31787: done checking for max_fail_percentage 22225 1726882774.31788: checking to see if all hosts have failed and the running result is not ok 22225 1726882774.31789: done checking to see if all hosts have failed 22225 1726882774.31790: getting the remaining hosts for this loop 22225 1726882774.31791: done getting the remaining hosts for this loop 22225 1726882774.31795: getting the next task for host managed_node1 22225 1726882774.31801: done getting next task for host managed_node1 22225 1726882774.31804: ^ task is: TASK: Test gateway can be pinged 22225 1726882774.31806: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882774.31809: getting variables 22225 1726882774.31810: in VariableManager get_vars() 22225 1726882774.31852: Calling all_inventory to load vars for managed_node1 22225 1726882774.31854: Calling groups_inventory to load vars for managed_node1 22225 1726882774.31856: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882774.31867: Calling all_plugins_play to load vars for managed_node1 22225 1726882774.31870: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882774.31873: Calling groups_plugins_play to load vars for managed_node1 22225 1726882774.33027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882774.35366: done with get_vars() 22225 1726882774.35395: done getting variables 22225 1726882774.35461: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Test gateway can be pinged] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86 Friday 20 September 2024 21:39:34 -0400 (0:00:01.346) 0:00:29.747 ****** 22225 1726882774.35493: entering _queue_task() for managed_node1/command 22225 1726882774.36053: worker is 1 (out of 1 available) 22225 1726882774.36063: exiting _queue_task() for managed_node1/command 22225 1726882774.36073: done queuing things up, now waiting for results queue to drain 22225 1726882774.36074: waiting for pending results... 22225 1726882774.36311: running TaskExecutor() for managed_node1/TASK: Test gateway can be pinged 22225 1726882774.36316: in run() - task 0affc7ec-ae25-ec05-55b7-000000000065 22225 1726882774.36319: variable 'ansible_search_path' from source: unknown 22225 1726882774.36332: calling self._execute() 22225 1726882774.36455: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882774.36468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882774.36486: variable 'omit' from source: magic vars 22225 1726882774.36903: variable 'ansible_distribution_major_version' from source: facts 22225 1726882774.36919: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882774.36932: variable 'omit' from source: magic vars 22225 1726882774.36960: variable 'omit' from source: magic vars 22225 1726882774.37003: variable 'omit' from source: magic vars 22225 1726882774.37057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882774.37116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882774.37147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882774.37177: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882774.37285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882774.37289: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882774.37292: variable 'ansible_host' from source: host vars for 
'managed_node1' 22225 1726882774.37294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882774.37374: Set connection var ansible_connection to ssh 22225 1726882774.37419: Set connection var ansible_pipelining to False 22225 1726882774.37436: Set connection var ansible_shell_executable to /bin/sh 22225 1726882774.37461: Set connection var ansible_timeout to 10 22225 1726882774.37473: Set connection var ansible_shell_type to sh 22225 1726882774.37502: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882774.37537: variable 'ansible_shell_executable' from source: unknown 22225 1726882774.37553: variable 'ansible_connection' from source: unknown 22225 1726882774.37561: variable 'ansible_module_compression' from source: unknown 22225 1726882774.37569: variable 'ansible_shell_type' from source: unknown 22225 1726882774.37576: variable 'ansible_shell_executable' from source: unknown 22225 1726882774.37586: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882774.37716: variable 'ansible_pipelining' from source: unknown 22225 1726882774.37720: variable 'ansible_timeout' from source: unknown 22225 1726882774.37725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882774.37872: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882774.37944: variable 'omit' from source: magic vars 22225 1726882774.37956: starting attempt loop 22225 1726882774.37962: running the handler 22225 1726882774.37983: _low_level_execute_command(): starting 22225 1726882774.38007: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882774.39143: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882774.39317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882774.39454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882774.39651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882774.41212: stdout chunk (state=3): >>>/root <<< 22225 1726882774.41415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882774.41419: stdout chunk (state=3): >>><<< 22225 1726882774.41428: stderr chunk (state=3): >>><<< 22225 
1726882774.41561: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882774.41565: _low_level_execute_command(): starting 22225 1726882774.41568: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748 `" && echo ansible-tmp-1726882774.4145193-23296-53974743839748="` echo /root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748 `" ) && sleep 0' 22225 1726882774.43454: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882774.43550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882774.43576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882774.43607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882774.43727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882774.45678: stdout chunk (state=3): >>>ansible-tmp-1726882774.4145193-23296-53974743839748=/root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748 <<< 22225 1726882774.45974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882774.45977: stderr chunk (state=3): >>><<< 22225 1726882774.45979: stdout chunk (state=3): >>><<< 22225 1726882774.45982: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882774.4145193-23296-53974743839748=/root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882774.45984: variable 'ansible_module_compression' from source: unknown 22225 1726882774.46003: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882774.46043: variable 'ansible_facts' from source: unknown 22225 1726882774.46125: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/AnsiballZ_command.py 22225 1726882774.46345: Sending initial data 22225 1726882774.46349: Sent initial data (155 bytes) 22225 1726882774.46926: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882774.47028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882774.47040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882774.47088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882774.47101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882774.47268: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882774.47442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882774.47518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882774.49096: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882774.49164: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882774.49240: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpt49ylu04 /root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/AnsiballZ_command.py <<< 22225 1726882774.49256: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/AnsiballZ_command.py" <<< 22225 1726882774.49329: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpt49ylu04" to remote "/root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/AnsiballZ_command.py" <<< 22225 1726882774.50188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882774.50244: stderr chunk (state=3): >>><<< 22225 1726882774.50254: stdout chunk (state=3): >>><<< 22225 1726882774.50311: done transferring module to remote 22225 1726882774.50443: _low_level_execute_command(): starting 22225 1726882774.50462: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/ /root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/AnsiballZ_command.py && sleep 0' 22225 1726882774.51676: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882774.51683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882774.51686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882774.51689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882774.51691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882774.51839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882774.52049: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 22225 1726882774.52052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882774.52155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882774.54026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882774.54045: stdout chunk (state=3): >>><<< 22225 1726882774.54062: stderr chunk (state=3): >>><<< 22225 1726882774.54087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882774.54186: _low_level_execute_command(): starting 22225 1726882774.54190: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/AnsiballZ_command.py && sleep 0' 22225 1726882774.55360: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882774.55572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882774.55655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882774.55754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882774.72545: stdout chunk (state=3): >>> {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.052 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 
0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.052/0.052/0.052/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 21:39:34.719471", "end": "2024-09-20 21:39:34.723667", "delta": "0:00:00.004196", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882774.74160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882774.74176: stdout chunk (state=3): >>><<< 22225 1726882774.74193: stderr chunk (state=3): >>><<< 22225 1726882774.74329: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.052 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.052/0.052/0.052/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 21:39:34.719471", "end": "2024-09-20 21:39:34.723667", "delta": "0:00:00.004196", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
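The module_args logged above (_raw_params: "ping6 -c1 2001:db8::1", _uses_shell: false) show the raw command module returning changed: true, while the task result a few records below reports changed: false and a conditional logged as "Evaluated conditional (False)". That is consistent with a command task that overrides changed_when. A minimal sketch under that assumption follows; it is not the verbatim task at tests_ipv6.yml:86.

# Sketch inferred from the logged command invocation; changed_when: false is an
# assumption based on the module reporting changed=true while the task result is changed=false.
- name: Test gateway can be pinged
  ansible.builtin.command: ping6 -c1 2001:db8::1
  changed_when: false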
22225 1726882774.74333: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ping6 -c1 2001:db8::1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882774.74335: _low_level_execute_command(): starting 22225 1726882774.74338: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882774.4145193-23296-53974743839748/ > /dev/null 2>&1 && sleep 0' 22225 1726882774.75019: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882774.75034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882774.75124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882774.75167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882774.75189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882774.75242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882774.75288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882774.77354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882774.77358: stdout chunk (state=3): >>><<< 22225 1726882774.77361: stderr chunk (state=3): >>><<< 22225 1726882774.77542: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882774.77551: handler run complete 22225 1726882774.77554: Evaluated conditional (False): False 22225 1726882774.77556: attempt loop complete, returning result 22225 1726882774.77559: _execute() done 22225 1726882774.77561: dumping result to json 22225 1726882774.77563: done dumping result, returning 22225 1726882774.77565: done running TaskExecutor() for managed_node1/TASK: Test gateway can be pinged [0affc7ec-ae25-ec05-55b7-000000000065] 22225 1726882774.77568: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000065 22225 1726882774.78032: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000065 22225 1726882774.78042: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ping6", "-c1", "2001:db8::1" ], "delta": "0:00:00.004196", "end": "2024-09-20 21:39:34.723667", "rc": 0, "start": "2024-09-20 21:39:34.719471" } STDOUT: PING 2001:db8::1 (2001:db8::1) 56 data bytes 64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.052 ms --- 2001:db8::1 ping statistics --- 1 packets transmitted, 1 received, 0% packet loss, time 0ms rtt min/avg/max/mdev = 0.052/0.052/0.052/0.000 ms 22225 1726882774.78184: no more pending results, returning what we have 22225 1726882774.78189: results queue empty 22225 1726882774.78190: checking for any_errors_fatal 22225 1726882774.78200: done checking for any_errors_fatal 22225 1726882774.78201: checking for max_fail_percentage 22225 1726882774.78204: done checking for max_fail_percentage 22225 1726882774.78205: checking to see if all hosts have failed and the running result is not ok 22225 1726882774.78206: done checking to see if all hosts have failed 22225 1726882774.78206: getting the remaining hosts for this loop 22225 1726882774.78208: done getting the remaining hosts for this loop 22225 1726882774.78213: getting the next task for host managed_node1 22225 1726882774.78224: done getting next task for host managed_node1 22225 1726882774.78227: ^ task is: TASK: TEARDOWN: remove profiles. 22225 1726882774.78230: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882774.78233: getting variables 22225 1726882774.78235: in VariableManager get_vars() 22225 1726882774.78282: Calling all_inventory to load vars for managed_node1 22225 1726882774.78286: Calling groups_inventory to load vars for managed_node1 22225 1726882774.78288: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882774.78301: Calling all_plugins_play to load vars for managed_node1 22225 1726882774.78305: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882774.78313: Calling groups_plugins_play to load vars for managed_node1 22225 1726882774.81148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882774.84715: done with get_vars() 22225 1726882774.84778: done getting variables 22225 1726882774.84947: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:92 Friday 20 September 2024 21:39:34 -0400 (0:00:00.495) 0:00:30.243 ****** 22225 1726882774.85118: entering _queue_task() for managed_node1/debug 22225 1726882774.85631: worker is 1 (out of 1 available) 22225 1726882774.85647: exiting _queue_task() for managed_node1/debug 22225 1726882774.85659: done queuing things up, now waiting for results queue to drain 22225 1726882774.85661: waiting for pending results... 22225 1726882774.86799: running TaskExecutor() for managed_node1/TASK: TEARDOWN: remove profiles. 
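[editor's note] The task just queued, "TEARDOWN: remove profiles." (tests_ipv6.yml:92), resolves to the debug action plugin; its result further below prints only a banner of '#' characters. A minimal sketch of such a banner task, with the exact message text assumed beyond the row of hashes echoed in the MSG output:

  # hypothetical sketch
  - name: "TEARDOWN: remove profiles."
    ansible.builtin.debug:
      msg: "##################################################"   # length illustrative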
22225 1726882774.87065: in run() - task 0affc7ec-ae25-ec05-55b7-000000000066 22225 1726882774.87107: variable 'ansible_search_path' from source: unknown 22225 1726882774.87243: calling self._execute() 22225 1726882774.87589: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882774.87597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882774.87600: variable 'omit' from source: magic vars 22225 1726882774.88028: variable 'ansible_distribution_major_version' from source: facts 22225 1726882774.88051: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882774.88143: variable 'omit' from source: magic vars 22225 1726882774.88147: variable 'omit' from source: magic vars 22225 1726882774.88167: variable 'omit' from source: magic vars 22225 1726882774.88218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882774.88274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882774.88304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882774.88330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882774.88352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882774.88395: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882774.88457: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882774.88460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882774.88534: Set connection var ansible_connection to ssh 22225 1726882774.88550: Set connection var ansible_pipelining to False 22225 1726882774.88567: Set connection var ansible_shell_executable to /bin/sh 22225 1726882774.88578: Set connection var ansible_timeout to 10 22225 1726882774.88588: Set connection var ansible_shell_type to sh 22225 1726882774.88646: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882774.88715: variable 'ansible_shell_executable' from source: unknown 22225 1726882774.88750: variable 'ansible_connection' from source: unknown 22225 1726882774.88901: variable 'ansible_module_compression' from source: unknown 22225 1726882774.88904: variable 'ansible_shell_type' from source: unknown 22225 1726882774.88907: variable 'ansible_shell_executable' from source: unknown 22225 1726882774.88909: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882774.88911: variable 'ansible_pipelining' from source: unknown 22225 1726882774.88914: variable 'ansible_timeout' from source: unknown 22225 1726882774.88916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882774.89294: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882774.89343: variable 'omit' from source: magic vars 22225 1726882774.89347: starting attempt loop 22225 1726882774.89550: running the handler 22225 1726882774.89554: handler run complete 22225 1726882774.89557: attempt loop complete, 
returning result 22225 1726882774.89559: _execute() done 22225 1726882774.89561: dumping result to json 22225 1726882774.89564: done dumping result, returning 22225 1726882774.89566: done running TaskExecutor() for managed_node1/TASK: TEARDOWN: remove profiles. [0affc7ec-ae25-ec05-55b7-000000000066] 22225 1726882774.89568: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000066 22225 1726882774.89654: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000066 22225 1726882774.89658: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 22225 1726882774.89733: no more pending results, returning what we have 22225 1726882774.89738: results queue empty 22225 1726882774.89738: checking for any_errors_fatal 22225 1726882774.89749: done checking for any_errors_fatal 22225 1726882774.89750: checking for max_fail_percentage 22225 1726882774.89752: done checking for max_fail_percentage 22225 1726882774.89753: checking to see if all hosts have failed and the running result is not ok 22225 1726882774.89754: done checking to see if all hosts have failed 22225 1726882774.89755: getting the remaining hosts for this loop 22225 1726882774.89756: done getting the remaining hosts for this loop 22225 1726882774.89761: getting the next task for host managed_node1 22225 1726882774.89774: done getting next task for host managed_node1 22225 1726882774.89787: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22225 1726882774.89791: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882774.89814: getting variables 22225 1726882774.89816: in VariableManager get_vars() 22225 1726882774.89865: Calling all_inventory to load vars for managed_node1 22225 1726882774.89868: Calling groups_inventory to load vars for managed_node1 22225 1726882774.89870: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882774.89892: Calling all_plugins_play to load vars for managed_node1 22225 1726882774.89895: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882774.89899: Calling groups_plugins_play to load vars for managed_node1 22225 1726882774.93067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882774.95938: done with get_vars() 22225 1726882774.95979: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:39:34 -0400 (0:00:00.111) 0:00:30.355 ****** 22225 1726882774.96257: entering _queue_task() for managed_node1/include_tasks 22225 1726882774.96737: worker is 1 (out of 1 available) 22225 1726882774.96754: exiting _queue_task() for managed_node1/include_tasks 22225 1726882774.96767: done queuing things up, now waiting for results queue to drain 22225 1726882774.96768: waiting for pending results... 
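[editor's note] The role entry point at roles/network/tasks/main.yml:4 pulls in the fact-gathering helper; the log below loads set_facts.yml and extends the task list for managed_node1. A hedged sketch of that include, assuming a plain include_tasks with no extra arguments:

  # roles/network/tasks/main.yml (sketch)
  - name: Ensure ansible_facts used by role
    ansible.builtin.include_tasks: set_facts.yml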
22225 1726882774.97340: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22225 1726882774.97346: in run() - task 0affc7ec-ae25-ec05-55b7-00000000006e 22225 1726882774.97350: variable 'ansible_search_path' from source: unknown 22225 1726882774.97353: variable 'ansible_search_path' from source: unknown 22225 1726882774.97356: calling self._execute() 22225 1726882774.97433: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882774.97450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882774.97475: variable 'omit' from source: magic vars 22225 1726882774.97899: variable 'ansible_distribution_major_version' from source: facts 22225 1726882774.97918: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882774.97936: _execute() done 22225 1726882774.97947: dumping result to json 22225 1726882774.97956: done dumping result, returning 22225 1726882774.97967: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affc7ec-ae25-ec05-55b7-00000000006e] 22225 1726882774.97977: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000006e 22225 1726882774.98152: no more pending results, returning what we have 22225 1726882774.98157: in VariableManager get_vars() 22225 1726882774.98209: Calling all_inventory to load vars for managed_node1 22225 1726882774.98212: Calling groups_inventory to load vars for managed_node1 22225 1726882774.98214: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882774.98230: Calling all_plugins_play to load vars for managed_node1 22225 1726882774.98234: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882774.98237: Calling groups_plugins_play to load vars for managed_node1 22225 1726882774.98925: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000006e 22225 1726882774.98929: WORKER PROCESS EXITING 22225 1726882775.00166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882775.02300: done with get_vars() 22225 1726882775.02326: variable 'ansible_search_path' from source: unknown 22225 1726882775.02327: variable 'ansible_search_path' from source: unknown 22225 1726882775.02371: we have included files to process 22225 1726882775.02372: generating all_blocks data 22225 1726882775.02375: done generating all_blocks data 22225 1726882775.02383: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22225 1726882775.02384: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22225 1726882775.02387: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22225 1726882775.03012: done processing included file 22225 1726882775.03014: iterating over new_blocks loaded from include file 22225 1726882775.03016: in VariableManager get_vars() 22225 1726882775.03046: done with get_vars() 22225 1726882775.03048: filtering new block on tags 22225 1726882775.03068: done filtering new block on tags 22225 1726882775.03071: in VariableManager get_vars() 22225 1726882775.03099: done with get_vars() 22225 1726882775.03101: filtering new block on tags 22225 1726882775.03127: done filtering new block on tags 22225 1726882775.03130: in 
VariableManager get_vars() 22225 1726882775.03155: done with get_vars() 22225 1726882775.03157: filtering new block on tags 22225 1726882775.03177: done filtering new block on tags 22225 1726882775.03182: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 22225 1726882775.03188: extending task lists for all hosts with included blocks 22225 1726882775.04142: done extending task lists 22225 1726882775.04144: done processing included files 22225 1726882775.04144: results queue empty 22225 1726882775.04145: checking for any_errors_fatal 22225 1726882775.04149: done checking for any_errors_fatal 22225 1726882775.04150: checking for max_fail_percentage 22225 1726882775.04151: done checking for max_fail_percentage 22225 1726882775.04152: checking to see if all hosts have failed and the running result is not ok 22225 1726882775.04153: done checking to see if all hosts have failed 22225 1726882775.04153: getting the remaining hosts for this loop 22225 1726882775.04155: done getting the remaining hosts for this loop 22225 1726882775.04157: getting the next task for host managed_node1 22225 1726882775.04162: done getting next task for host managed_node1 22225 1726882775.04165: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22225 1726882775.04169: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882775.04178: getting variables 22225 1726882775.04182: in VariableManager get_vars() 22225 1726882775.04199: Calling all_inventory to load vars for managed_node1 22225 1726882775.04202: Calling groups_inventory to load vars for managed_node1 22225 1726882775.04204: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882775.04210: Calling all_plugins_play to load vars for managed_node1 22225 1726882775.04213: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882775.04216: Calling groups_plugins_play to load vars for managed_node1 22225 1726882775.10400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882775.12534: done with get_vars() 22225 1726882775.12566: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:39:35 -0400 (0:00:00.164) 0:00:30.519 ****** 22225 1726882775.12659: entering _queue_task() for managed_node1/setup 22225 1726882775.13054: worker is 1 (out of 1 available) 22225 1726882775.13069: exiting _queue_task() for managed_node1/setup 22225 1726882775.13085: done queuing things up, now waiting for results queue to drain 22225 1726882775.13087: waiting for pending results... 22225 1726882775.13409: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22225 1726882775.13609: in run() - task 0affc7ec-ae25-ec05-55b7-000000000513 22225 1726882775.13636: variable 'ansible_search_path' from source: unknown 22225 1726882775.13646: variable 'ansible_search_path' from source: unknown 22225 1726882775.13697: calling self._execute() 22225 1726882775.13812: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882775.13828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882775.13845: variable 'omit' from source: magic vars 22225 1726882775.14283: variable 'ansible_distribution_major_version' from source: facts 22225 1726882775.14303: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882775.14558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882775.16975: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882775.17067: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882775.17118: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882775.17162: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882775.17197: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882775.17294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882775.17336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 22225 1726882775.17369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882775.17421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882775.17448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882775.17512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882775.17548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882775.17578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882775.17630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882775.17656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882775.17865: variable '__network_required_facts' from source: role '' defaults 22225 1726882775.17869: variable 'ansible_facts' from source: unknown 22225 1726882775.18816: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22225 1726882775.18829: when evaluation is False, skipping this task 22225 1726882775.18839: _execute() done 22225 1726882775.18882: dumping result to json 22225 1726882775.18886: done dumping result, returning 22225 1726882775.18888: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affc7ec-ae25-ec05-55b7-000000000513] 22225 1726882775.18957: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000513 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882775.19125: no more pending results, returning what we have 22225 1726882775.19130: results queue empty 22225 1726882775.19132: checking for any_errors_fatal 22225 1726882775.19134: done checking for any_errors_fatal 22225 1726882775.19135: checking for max_fail_percentage 22225 1726882775.19137: done checking for max_fail_percentage 22225 1726882775.19139: checking to see if all hosts have failed and the running result is not ok 22225 1726882775.19140: done checking to see if all hosts have failed 22225 1726882775.19141: getting the remaining hosts for this loop 22225 1726882775.19143: done getting the remaining hosts for this loop 22225 1726882775.19148: getting the next task for host managed_node1 22225 1726882775.19159: done getting next task for host 
managed_node1 22225 1726882775.19164: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22225 1726882775.19169: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882775.19193: getting variables 22225 1726882775.19195: in VariableManager get_vars() 22225 1726882775.19248: Calling all_inventory to load vars for managed_node1 22225 1726882775.19251: Calling groups_inventory to load vars for managed_node1 22225 1726882775.19254: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882775.19266: Calling all_plugins_play to load vars for managed_node1 22225 1726882775.19269: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882775.19273: Calling groups_plugins_play to load vars for managed_node1 22225 1726882775.19842: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000513 22225 1726882775.19846: WORKER PROCESS EXITING 22225 1726882775.21324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882775.23421: done with get_vars() 22225 1726882775.23454: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:39:35 -0400 (0:00:00.109) 0:00:30.628 ****** 22225 1726882775.23584: entering _queue_task() for managed_node1/stat 22225 1726882775.24060: worker is 1 (out of 1 available) 22225 1726882775.24073: exiting _queue_task() for managed_node1/stat 22225 1726882775.24086: done queuing things up, now waiting for results queue to drain 22225 1726882775.24087: waiting for pending results... 
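[editor's note] The "Ensure ansible_facts used by role are present" task (set_facts.yml:3) was skipped above because the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False, i.e. every fact the role needs was already gathered. A sketch of what such a guarded fact-gathering task could look like; the when expression is quoted from the log, the gather_subset value is an assumption:

  # hypothetical sketch
  - name: Ensure ansible_facts used by role are present
    ansible.builtin.setup:
      gather_subset: min      # assumption: minimal gathering
    when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0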
22225 1726882775.24310: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 22225 1726882775.24503: in run() - task 0affc7ec-ae25-ec05-55b7-000000000515 22225 1726882775.24525: variable 'ansible_search_path' from source: unknown 22225 1726882775.24534: variable 'ansible_search_path' from source: unknown 22225 1726882775.24582: calling self._execute() 22225 1726882775.24694: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882775.24705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882775.24718: variable 'omit' from source: magic vars 22225 1726882775.25146: variable 'ansible_distribution_major_version' from source: facts 22225 1726882775.25166: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882775.25364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882775.25673: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882775.25740: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882775.26428: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882775.26432: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882775.26436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882775.26439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882775.26442: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882775.26444: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882775.26529: variable '__network_is_ostree' from source: set_fact 22225 1726882775.26544: Evaluated conditional (not __network_is_ostree is defined): False 22225 1726882775.26552: when evaluation is False, skipping this task 22225 1726882775.26564: _execute() done 22225 1726882775.26571: dumping result to json 22225 1726882775.26671: done dumping result, returning 22225 1726882775.26675: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affc7ec-ae25-ec05-55b7-000000000515] 22225 1726882775.26678: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000515 22225 1726882775.26767: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000515 22225 1726882775.26771: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22225 1726882775.27077: no more pending results, returning what we have 22225 1726882775.27084: results queue empty 22225 1726882775.27085: checking for any_errors_fatal 22225 1726882775.27091: done checking for any_errors_fatal 22225 1726882775.27092: checking for 
max_fail_percentage 22225 1726882775.27094: done checking for max_fail_percentage 22225 1726882775.27095: checking to see if all hosts have failed and the running result is not ok 22225 1726882775.27097: done checking to see if all hosts have failed 22225 1726882775.27097: getting the remaining hosts for this loop 22225 1726882775.27099: done getting the remaining hosts for this loop 22225 1726882775.27104: getting the next task for host managed_node1 22225 1726882775.27112: done getting next task for host managed_node1 22225 1726882775.27116: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22225 1726882775.27121: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882775.27143: getting variables 22225 1726882775.27145: in VariableManager get_vars() 22225 1726882775.27194: Calling all_inventory to load vars for managed_node1 22225 1726882775.27197: Calling groups_inventory to load vars for managed_node1 22225 1726882775.27200: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882775.27213: Calling all_plugins_play to load vars for managed_node1 22225 1726882775.27216: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882775.27219: Calling groups_plugins_play to load vars for managed_node1 22225 1726882775.29224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882775.31458: done with get_vars() 22225 1726882775.31494: done getting variables 22225 1726882775.31564: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:39:35 -0400 (0:00:00.080) 0:00:30.708 ****** 22225 1726882775.31609: entering _queue_task() for managed_node1/set_fact 22225 1726882775.32004: worker is 1 (out of 1 available) 22225 1726882775.32017: exiting _queue_task() for managed_node1/set_fact 22225 1726882775.32034: done queuing things up, now waiting for results queue to drain 22225 1726882775.32036: waiting for pending results... 
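[editor's note] The ostree check above was skipped, and the "Set flag to indicate system is ostree" set_fact task just queued is skipped below for the same reason: __network_is_ostree was already set by an earlier set_fact, so the guard not __network_is_ostree is defined is False. A hedged sketch of that stat/set_fact pair; the /run/ostree-booted path and the register variable name are assumptions, while the task names and when expression come from the log:

  # hypothetical sketch
  - name: Check if system is ostree
    ansible.builtin.stat:
      path: /run/ostree-booted          # assumption: conventional ostree marker file
    register: __ostree_booted_stat      # hypothetical register name
    when: not __network_is_ostree is defined

  - name: Set flag to indicate system is ostree
    ansible.builtin.set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined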
22225 1726882775.32345: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22225 1726882775.32535: in run() - task 0affc7ec-ae25-ec05-55b7-000000000516 22225 1726882775.32556: variable 'ansible_search_path' from source: unknown 22225 1726882775.32565: variable 'ansible_search_path' from source: unknown 22225 1726882775.32613: calling self._execute() 22225 1726882775.32731: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882775.32744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882775.32759: variable 'omit' from source: magic vars 22225 1726882775.33178: variable 'ansible_distribution_major_version' from source: facts 22225 1726882775.33198: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882775.33440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882775.33694: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882775.33745: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882775.33833: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882775.33883: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882775.33989: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882775.34021: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882775.34090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882775.34093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882775.34201: variable '__network_is_ostree' from source: set_fact 22225 1726882775.34213: Evaluated conditional (not __network_is_ostree is defined): False 22225 1726882775.34221: when evaluation is False, skipping this task 22225 1726882775.34229: _execute() done 22225 1726882775.34237: dumping result to json 22225 1726882775.34306: done dumping result, returning 22225 1726882775.34310: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affc7ec-ae25-ec05-55b7-000000000516] 22225 1726882775.34313: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000516 22225 1726882775.34395: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000516 22225 1726882775.34398: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22225 1726882775.34463: no more pending results, returning what we have 22225 1726882775.34468: results queue empty 22225 1726882775.34469: checking for any_errors_fatal 22225 1726882775.34476: done checking for any_errors_fatal 22225 
1726882775.34477: checking for max_fail_percentage 22225 1726882775.34482: done checking for max_fail_percentage 22225 1726882775.34483: checking to see if all hosts have failed and the running result is not ok 22225 1726882775.34484: done checking to see if all hosts have failed 22225 1726882775.34485: getting the remaining hosts for this loop 22225 1726882775.34487: done getting the remaining hosts for this loop 22225 1726882775.34492: getting the next task for host managed_node1 22225 1726882775.34504: done getting next task for host managed_node1 22225 1726882775.34507: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22225 1726882775.34512: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882775.34534: getting variables 22225 1726882775.34536: in VariableManager get_vars() 22225 1726882775.34586: Calling all_inventory to load vars for managed_node1 22225 1726882775.34589: Calling groups_inventory to load vars for managed_node1 22225 1726882775.34591: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882775.34604: Calling all_plugins_play to load vars for managed_node1 22225 1726882775.34607: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882775.34610: Calling groups_plugins_play to load vars for managed_node1 22225 1726882775.36761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882775.38976: done with get_vars() 22225 1726882775.39006: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:39:35 -0400 (0:00:00.075) 0:00:30.783 ****** 22225 1726882775.39117: entering _queue_task() for managed_node1/service_facts 22225 1726882775.41788: worker is 1 (out of 1 available) 22225 1726882775.41803: exiting _queue_task() for managed_node1/service_facts 22225 1726882775.41816: done queuing things up, now waiting for results queue to drain 22225 1726882775.41818: waiting for pending results... 
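[editor's note] The "Check which services are running" task (set_facts.yml:21) resolves to the service_facts module. The lines that follow show the normal module execution flow: open the multiplexed SSH session, create a remote tmp dir, sftp the AnsiballZ_service_facts.py payload, chmod it, then run it with python3.12, which returns ansible_facts.services. A minimal sketch of the task plus an illustrative (not from the log) follow-up that reads one of the returned entries, auditd.service, which the output below reports as running:

  # sketch of the fact-gathering task
  - name: Check which services are running
    ansible.builtin.service_facts:

  # illustrative follow-up only: inspect one entry of the returned facts
  - name: Show auditd service state (hypothetical example)
    ansible.builtin.debug:
      var: ansible_facts.services['auditd.service']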
22225 1726882775.42245: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 22225 1726882775.42382: in run() - task 0affc7ec-ae25-ec05-55b7-000000000518 22225 1726882775.42407: variable 'ansible_search_path' from source: unknown 22225 1726882775.42415: variable 'ansible_search_path' from source: unknown 22225 1726882775.42478: calling self._execute() 22225 1726882775.42694: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882775.42701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882775.42704: variable 'omit' from source: magic vars 22225 1726882775.43078: variable 'ansible_distribution_major_version' from source: facts 22225 1726882775.43099: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882775.43111: variable 'omit' from source: magic vars 22225 1726882775.43211: variable 'omit' from source: magic vars 22225 1726882775.43328: variable 'omit' from source: magic vars 22225 1726882775.43332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882775.43375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882775.43404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882775.43430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882775.43446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882775.43495: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882775.43503: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882775.43511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882775.43634: Set connection var ansible_connection to ssh 22225 1726882775.43650: Set connection var ansible_pipelining to False 22225 1726882775.43663: Set connection var ansible_shell_executable to /bin/sh 22225 1726882775.43695: Set connection var ansible_timeout to 10 22225 1726882775.43698: Set connection var ansible_shell_type to sh 22225 1726882775.43726: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882775.43739: variable 'ansible_shell_executable' from source: unknown 22225 1726882775.43746: variable 'ansible_connection' from source: unknown 22225 1726882775.43754: variable 'ansible_module_compression' from source: unknown 22225 1726882775.43761: variable 'ansible_shell_type' from source: unknown 22225 1726882775.43767: variable 'ansible_shell_executable' from source: unknown 22225 1726882775.43806: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882775.43809: variable 'ansible_pipelining' from source: unknown 22225 1726882775.43812: variable 'ansible_timeout' from source: unknown 22225 1726882775.43889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882775.44070: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882775.44092: variable 'omit' from source: magic vars 22225 
1726882775.44102: starting attempt loop 22225 1726882775.44118: running the handler 22225 1726882775.44143: _low_level_execute_command(): starting 22225 1726882775.44154: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882775.45019: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882775.45110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882775.45147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882775.45200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882775.45265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882775.47256: stdout chunk (state=3): >>>/root <<< 22225 1726882775.47358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882775.47365: stdout chunk (state=3): >>><<< 22225 1726882775.47368: stderr chunk (state=3): >>><<< 22225 1726882775.47372: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882775.47374: _low_level_execute_command(): starting 22225 1726882775.47377: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121 `" && echo ansible-tmp-1726882775.4727578-23336-66785066654121="` echo 
/root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121 `" ) && sleep 0' 22225 1726882775.47928: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882775.47932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882775.47934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882775.47944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882775.47947: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882775.48000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882775.48003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882775.48038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882775.48161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882775.50127: stdout chunk (state=3): >>>ansible-tmp-1726882775.4727578-23336-66785066654121=/root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121 <<< 22225 1726882775.50331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882775.50334: stdout chunk (state=3): >>><<< 22225 1726882775.50337: stderr chunk (state=3): >>><<< 22225 1726882775.50528: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882775.4727578-23336-66785066654121=/root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882775.50532: variable 'ansible_module_compression' from source: unknown 22225 
1726882775.50534: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 22225 1726882775.50536: variable 'ansible_facts' from source: unknown 22225 1726882775.50593: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/AnsiballZ_service_facts.py 22225 1726882775.50807: Sending initial data 22225 1726882775.50818: Sent initial data (161 bytes) 22225 1726882775.51408: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882775.51412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882775.51440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882775.51453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882775.51537: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882775.51566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882775.51644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882775.53250: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882775.53319: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882775.53400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpf2qz79a0 /root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/AnsiballZ_service_facts.py <<< 22225 1726882775.53424: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/AnsiballZ_service_facts.py" <<< 22225 1726882775.53459: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpf2qz79a0" to remote "/root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/AnsiballZ_service_facts.py" <<< 22225 1726882775.54434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882775.54438: stdout chunk (state=3): >>><<< 22225 1726882775.54440: stderr chunk (state=3): >>><<< 22225 1726882775.54443: done transferring module to remote 22225 1726882775.54445: _low_level_execute_command(): starting 22225 1726882775.54447: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/ /root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/AnsiballZ_service_facts.py && sleep 0' 22225 1726882775.54945: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882775.54949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882775.54951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882775.54954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882775.54956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882775.55000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882775.55004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882775.55008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882775.55062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882775.57014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882775.57018: stdout chunk (state=3): >>><<< 22225 1726882775.57021: stderr chunk (state=3): >>><<< 22225 1726882775.57026: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882775.57028: _low_level_execute_command(): starting 22225 1726882775.57031: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/AnsiballZ_service_facts.py && sleep 0' 22225 1726882775.57455: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882775.57460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882775.57463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882775.57465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882775.57513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882775.57516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882775.57581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882777.74419: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 22225 1726882777.74437: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": 
"mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "sour<<< 22225 1726882777.74477: stdout chunk (state=3): >>>ce": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, 
"chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "syst<<< 22225 1726882777.74484: stdout chunk (state=3): >>>emd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status<<< 22225 1726882777.74507: stdout chunk (state=3): >>>": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22225 1726882777.76328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882777.76332: stderr chunk (state=3): >>><<< 22225 1726882777.76334: stdout chunk (state=3): >>><<< 22225 1726882777.76341: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": 
"systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, 
"plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
22225 1726882777.77329: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882777.77342: _low_level_execute_command(): starting 22225 1726882777.77348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882775.4727578-23336-66785066654121/ > /dev/null 2>&1 && sleep 0' 22225 1726882777.78060: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882777.78070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882777.78094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882777.78110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882777.78125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882777.78132: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882777.78206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882777.78243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882777.78408: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882777.78412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882777.78449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882777.80357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882777.80406: stderr chunk (state=3): >>><<< 22225 1726882777.80409: stdout chunk (state=3): >>><<< 22225 1726882777.80423: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882777.80431: handler run complete 22225 1726882777.80566: variable 'ansible_facts' from source: unknown 22225 1726882777.80700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882777.81037: variable 'ansible_facts' from source: unknown 22225 1726882777.81136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882777.81288: attempt loop complete, returning result 22225 1726882777.81292: _execute() done 22225 1726882777.81294: dumping result to json 22225 1726882777.81336: done dumping result, returning 22225 1726882777.81350: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affc7ec-ae25-ec05-55b7-000000000518] 22225 1726882777.81354: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000518 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882777.82055: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000518 22225 1726882777.82064: no more pending results, returning what we have 22225 1726882777.82066: results queue empty 22225 1726882777.82066: checking for any_errors_fatal 22225 1726882777.82071: done checking for any_errors_fatal 22225 1726882777.82071: checking for max_fail_percentage 22225 1726882777.82072: done checking for max_fail_percentage 22225 1726882777.82073: checking to see if all hosts have failed and the running result is not ok 22225 1726882777.82073: done checking to see if all hosts have failed 22225 1726882777.82074: getting the remaining hosts for this loop 22225 1726882777.82075: done getting the remaining hosts for this loop 22225 1726882777.82077: getting the next task for host managed_node1 22225 1726882777.82082: done getting next task for host managed_node1 22225 1726882777.82085: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 22225 1726882777.82088: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 22225 1726882777.82096: WORKER PROCESS EXITING 22225 1726882777.82102: getting variables 22225 1726882777.82103: in VariableManager get_vars() 22225 1726882777.82130: Calling all_inventory to load vars for managed_node1 22225 1726882777.82132: Calling groups_inventory to load vars for managed_node1 22225 1726882777.82134: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882777.82141: Calling all_plugins_play to load vars for managed_node1 22225 1726882777.82142: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882777.82144: Calling groups_plugins_play to load vars for managed_node1 22225 1726882777.83133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882777.84306: done with get_vars() 22225 1726882777.84325: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:39:37 -0400 (0:00:02.452) 0:00:33.236 ****** 22225 1726882777.84408: entering _queue_task() for managed_node1/package_facts 22225 1726882777.84649: worker is 1 (out of 1 available) 22225 1726882777.84664: exiting _queue_task() for managed_node1/package_facts 22225 1726882777.84675: done queuing things up, now waiting for results queue to drain 22225 1726882777.84677: waiting for pending results... 22225 1726882777.84866: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 22225 1726882777.84983: in run() - task 0affc7ec-ae25-ec05-55b7-000000000519 22225 1726882777.84998: variable 'ansible_search_path' from source: unknown 22225 1726882777.85001: variable 'ansible_search_path' from source: unknown 22225 1726882777.85036: calling self._execute() 22225 1726882777.85115: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882777.85119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882777.85133: variable 'omit' from source: magic vars 22225 1726882777.85423: variable 'ansible_distribution_major_version' from source: facts 22225 1726882777.85433: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882777.85440: variable 'omit' from source: magic vars 22225 1726882777.85499: variable 'omit' from source: magic vars 22225 1726882777.85525: variable 'omit' from source: magic vars 22225 1726882777.85559: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882777.85594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882777.85609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882777.85624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882777.85634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882777.85659: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882777.85663: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882777.85665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 
1726882777.85743: Set connection var ansible_connection to ssh 22225 1726882777.85751: Set connection var ansible_pipelining to False 22225 1726882777.85758: Set connection var ansible_shell_executable to /bin/sh 22225 1726882777.85764: Set connection var ansible_timeout to 10 22225 1726882777.85767: Set connection var ansible_shell_type to sh 22225 1726882777.85772: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882777.85797: variable 'ansible_shell_executable' from source: unknown 22225 1726882777.85800: variable 'ansible_connection' from source: unknown 22225 1726882777.85803: variable 'ansible_module_compression' from source: unknown 22225 1726882777.85806: variable 'ansible_shell_type' from source: unknown 22225 1726882777.85808: variable 'ansible_shell_executable' from source: unknown 22225 1726882777.85810: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882777.85815: variable 'ansible_pipelining' from source: unknown 22225 1726882777.85817: variable 'ansible_timeout' from source: unknown 22225 1726882777.85823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882777.85979: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882777.85990: variable 'omit' from source: magic vars 22225 1726882777.85996: starting attempt loop 22225 1726882777.86000: running the handler 22225 1726882777.86014: _low_level_execute_command(): starting 22225 1726882777.86020: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882777.86554: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882777.86558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882777.86563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882777.86565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882777.86618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882777.86623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882777.86628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882777.86683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882777.88353: stdout chunk (state=3): >>>/root <<< 22225 1726882777.88462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882777.88511: stderr chunk (state=3): >>><<< 22225 
1726882777.88515: stdout chunk (state=3): >>><<< 22225 1726882777.88537: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882777.88548: _low_level_execute_command(): starting 22225 1726882777.88553: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804 `" && echo ansible-tmp-1726882777.8853612-23415-76179447968804="` echo /root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804 `" ) && sleep 0' 22225 1726882777.89024: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882777.89027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882777.89030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882777.89039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882777.89042: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882777.89086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882777.89091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882777.89147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882777.91090: stdout chunk (state=3): >>>ansible-tmp-1726882777.8853612-23415-76179447968804=/root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804 <<< 22225 1726882777.91208: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 22225 1726882777.91260: stderr chunk (state=3): >>><<< 22225 1726882777.91264: stdout chunk (state=3): >>><<< 22225 1726882777.91278: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882777.8853612-23415-76179447968804=/root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882777.91316: variable 'ansible_module_compression' from source: unknown 22225 1726882777.91357: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 22225 1726882777.91411: variable 'ansible_facts' from source: unknown 22225 1726882777.91529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/AnsiballZ_package_facts.py 22225 1726882777.91640: Sending initial data 22225 1726882777.91644: Sent initial data (161 bytes) 22225 1726882777.92112: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882777.92116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882777.92119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882777.92121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882777.92178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882777.92185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882777.92187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882777.92239: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882777.93839: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 22225 1726882777.93843: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882777.93894: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882777.93950: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp14hl_3z6 /root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/AnsiballZ_package_facts.py <<< 22225 1726882777.93955: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/AnsiballZ_package_facts.py" <<< 22225 1726882777.94001: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp14hl_3z6" to remote "/root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/AnsiballZ_package_facts.py" <<< 22225 1726882777.95143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882777.95212: stderr chunk (state=3): >>><<< 22225 1726882777.95216: stdout chunk (state=3): >>><<< 22225 1726882777.95239: done transferring module to remote 22225 1726882777.95249: _low_level_execute_command(): starting 22225 1726882777.95254: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/ /root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/AnsiballZ_package_facts.py && sleep 0' 22225 1726882777.95711: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882777.95714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882777.95717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882777.95719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882777.95771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882777.95774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882777.95832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882777.97625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882777.97674: stderr chunk (state=3): >>><<< 22225 1726882777.97677: stdout chunk (state=3): >>><<< 22225 1726882777.97690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882777.97693: _low_level_execute_command(): starting 22225 1726882777.97698: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/AnsiballZ_package_facts.py && sleep 0' 22225 1726882777.98152: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882777.98156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882777.98158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882777.98160: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882777.98162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882777.98213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882777.98217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 22225 1726882777.98278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882778.60590: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 22225 1726882778.60612: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": 
[{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.<<< 22225 1726882778.60643: stdout chunk (state=3): >>>fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3<<< 22225 1726882778.60683: stdout chunk (state=3): >>>-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lib<<< 22225 1726882778.60693: stdout chunk (state=3): >>>xmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_6<<< 22225 1726882778.60726: stdout chunk (state=3): >>>4", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "<<< 22225 1726882778.60746: stdout chunk (state=3): 
>>>x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": 
"9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], <<< 22225 1726882778.60754: stdout chunk (state=3): >>>"perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch":<<< 22225 1726882778.60783: stdout chunk (state=3): >>> null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "sour<<< 22225 1726882778.60788: stdout chunk (state=3): >>>ce": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22225 1726882778.62713: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882778.62773: stderr chunk (state=3): >>><<< 22225 1726882778.62776: stdout chunk (state=3): >>><<< 22225 1726882778.62819: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", 
"release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", 
"version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": 
"grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": 
"libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": 
"7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": 
[{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": 
"perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": 
"8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.9", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": 
"python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882778.64831: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882778.64837: _low_level_execute_command(): starting 22225 1726882778.64839: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882777.8853612-23415-76179447968804/ > /dev/null 2>&1 && sleep 0' 22225 1726882778.65392: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882778.65409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882778.65420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882778.65469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882778.65482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882778.65548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882778.67470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882778.67524: stderr chunk (state=3): >>><<< 22225 1726882778.67528: stdout chunk (state=3): >>><<< 22225 1726882778.67545: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882778.67551: handler run complete 22225 1726882778.68195: variable 'ansible_facts' from source: unknown 22225 1726882778.68567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882778.70119: variable 'ansible_facts' from source: unknown 22225 1726882778.70463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882778.71020: attempt loop complete, returning result 22225 1726882778.71033: _execute() done 22225 1726882778.71036: dumping result to json 22225 1726882778.71182: done dumping result, returning 22225 1726882778.71193: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affc7ec-ae25-ec05-55b7-000000000519] 22225 1726882778.71198: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000519 22225 1726882778.73031: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000519 22225 1726882778.73036: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882778.73134: no more pending results, returning what we have 22225 1726882778.73136: results queue empty 22225 1726882778.73137: checking for any_errors_fatal 22225 1726882778.73140: done checking for any_errors_fatal 22225 1726882778.73141: checking for max_fail_percentage 22225 1726882778.73142: done checking for max_fail_percentage 22225 1726882778.73143: checking to see if all hosts have failed and the running result is not ok 22225 1726882778.73143: done checking to see if all hosts have failed 22225 1726882778.73144: getting the remaining hosts for this loop 22225 1726882778.73145: done getting the remaining hosts for this loop 22225 1726882778.73147: getting the next task for host managed_node1 22225 1726882778.73152: done getting next task for host managed_node1 22225 1726882778.73155: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22225 1726882778.73157: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882778.73164: getting variables 22225 1726882778.73164: in VariableManager get_vars() 22225 1726882778.73195: Calling all_inventory to load vars for managed_node1 22225 1726882778.73196: Calling groups_inventory to load vars for managed_node1 22225 1726882778.73198: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882778.73205: Calling all_plugins_play to load vars for managed_node1 22225 1726882778.73207: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882778.73209: Calling groups_plugins_play to load vars for managed_node1 22225 1726882778.74118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882778.75313: done with get_vars() 22225 1726882778.75338: done getting variables 22225 1726882778.75390: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:39:38 -0400 (0:00:00.910) 0:00:34.146 ****** 22225 1726882778.75418: entering _queue_task() for managed_node1/debug 22225 1726882778.75704: worker is 1 (out of 1 available) 22225 1726882778.75721: exiting _queue_task() for managed_node1/debug 22225 1726882778.75735: done queuing things up, now waiting for results queue to drain 22225 1726882778.75736: waiting for pending results... 
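The task just queued is the role's "Print network provider" step (task path roles/network/tasks/main.yml:7); its execution trace follows below. Reconstructed only from what this log shows (the debug action, the distribution-version conditional it evaluates, and the final "Using network provider: nm" message), the task plausibly looks like the sketch below. It is not the actual role source, and the version guard may well be applied at an enclosing block or import rather than on the task itself.

# Hedged reconstruction from the trace; network_provider is assumed to have been
# populated earlier by a set_fact task, as the log's "from source: set_fact" indicates.
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
  when: ansible_distribution_major_version != '6'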
22225 1726882778.75931: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 22225 1726882778.76032: in run() - task 0affc7ec-ae25-ec05-55b7-00000000006f 22225 1726882778.76047: variable 'ansible_search_path' from source: unknown 22225 1726882778.76051: variable 'ansible_search_path' from source: unknown 22225 1726882778.76086: calling self._execute() 22225 1726882778.76170: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882778.76178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882778.76185: variable 'omit' from source: magic vars 22225 1726882778.76487: variable 'ansible_distribution_major_version' from source: facts 22225 1726882778.76495: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882778.76501: variable 'omit' from source: magic vars 22225 1726882778.76550: variable 'omit' from source: magic vars 22225 1726882778.76621: variable 'network_provider' from source: set_fact 22225 1726882778.76641: variable 'omit' from source: magic vars 22225 1726882778.76674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882778.76712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882778.76730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882778.76750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882778.76758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882778.76785: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882778.76788: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882778.76791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882778.76863: Set connection var ansible_connection to ssh 22225 1726882778.76872: Set connection var ansible_pipelining to False 22225 1726882778.76882: Set connection var ansible_shell_executable to /bin/sh 22225 1726882778.76885: Set connection var ansible_timeout to 10 22225 1726882778.76888: Set connection var ansible_shell_type to sh 22225 1726882778.76893: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882778.76913: variable 'ansible_shell_executable' from source: unknown 22225 1726882778.76917: variable 'ansible_connection' from source: unknown 22225 1726882778.76919: variable 'ansible_module_compression' from source: unknown 22225 1726882778.76923: variable 'ansible_shell_type' from source: unknown 22225 1726882778.76926: variable 'ansible_shell_executable' from source: unknown 22225 1726882778.76928: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882778.76933: variable 'ansible_pipelining' from source: unknown 22225 1726882778.76935: variable 'ansible_timeout' from source: unknown 22225 1726882778.76940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882778.77050: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 22225 1726882778.77066: variable 'omit' from source: magic vars 22225 1726882778.77071: starting attempt loop 22225 1726882778.77074: running the handler 22225 1726882778.77108: handler run complete 22225 1726882778.77119: attempt loop complete, returning result 22225 1726882778.77124: _execute() done 22225 1726882778.77127: dumping result to json 22225 1726882778.77129: done dumping result, returning 22225 1726882778.77136: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affc7ec-ae25-ec05-55b7-00000000006f] 22225 1726882778.77142: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000006f 22225 1726882778.77232: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000006f 22225 1726882778.77236: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 22225 1726882778.77304: no more pending results, returning what we have 22225 1726882778.77307: results queue empty 22225 1726882778.77309: checking for any_errors_fatal 22225 1726882778.77321: done checking for any_errors_fatal 22225 1726882778.77323: checking for max_fail_percentage 22225 1726882778.77325: done checking for max_fail_percentage 22225 1726882778.77326: checking to see if all hosts have failed and the running result is not ok 22225 1726882778.77327: done checking to see if all hosts have failed 22225 1726882778.77328: getting the remaining hosts for this loop 22225 1726882778.77330: done getting the remaining hosts for this loop 22225 1726882778.77334: getting the next task for host managed_node1 22225 1726882778.77339: done getting next task for host managed_node1 22225 1726882778.77343: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22225 1726882778.77346: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882778.77357: getting variables 22225 1726882778.77359: in VariableManager get_vars() 22225 1726882778.77398: Calling all_inventory to load vars for managed_node1 22225 1726882778.77400: Calling groups_inventory to load vars for managed_node1 22225 1726882778.77402: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882778.77412: Calling all_plugins_play to load vars for managed_node1 22225 1726882778.77414: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882778.77417: Calling groups_plugins_play to load vars for managed_node1 22225 1726882778.78501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882778.79999: done with get_vars() 22225 1726882778.80027: done getting variables 22225 1726882778.80099: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:39:38 -0400 (0:00:00.047) 0:00:34.194 ****** 22225 1726882778.80140: entering _queue_task() for managed_node1/fail 22225 1726882778.80516: worker is 1 (out of 1 available) 22225 1726882778.80534: exiting _queue_task() for managed_node1/fail 22225 1726882778.80548: done queuing things up, now waiting for results queue to drain 22225 1726882778.80550: waiting for pending results... 
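Next comes the first of the role's guard tasks (task path roles/network/tasks/main.yml:11). It uses the fail action and, in the trace below, is skipped because network_state is empty. A plausible shape for it, inferred from the task name and the single conditional the trace evaluates, is sketched here; the failure message and the extra initscripts-provider check are assumptions, and the following task at main.yml:18 ("... if the system version of the managed host is below 8") visibly follows the same pattern.

# Hedged sketch; only the fail action and the network_state != {} condition are
# visible in the trace, the msg text and the provider comparison are assumptions.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider.
  when:
    - network_state != {}
    - network_provider == "initscripts"   # assumed second condition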
22225 1726882778.81145: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22225 1726882778.81150: in run() - task 0affc7ec-ae25-ec05-55b7-000000000070 22225 1726882778.81154: variable 'ansible_search_path' from source: unknown 22225 1726882778.81157: variable 'ansible_search_path' from source: unknown 22225 1726882778.81159: calling self._execute() 22225 1726882778.81246: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882778.81259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882778.81276: variable 'omit' from source: magic vars 22225 1726882778.81737: variable 'ansible_distribution_major_version' from source: facts 22225 1726882778.81758: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882778.81916: variable 'network_state' from source: role '' defaults 22225 1726882778.81936: Evaluated conditional (network_state != {}): False 22225 1726882778.81947: when evaluation is False, skipping this task 22225 1726882778.81955: _execute() done 22225 1726882778.81964: dumping result to json 22225 1726882778.81973: done dumping result, returning 22225 1726882778.81991: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affc7ec-ae25-ec05-55b7-000000000070] 22225 1726882778.82010: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000070 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882778.82297: no more pending results, returning what we have 22225 1726882778.82302: results queue empty 22225 1726882778.82304: checking for any_errors_fatal 22225 1726882778.82312: done checking for any_errors_fatal 22225 1726882778.82313: checking for max_fail_percentage 22225 1726882778.82315: done checking for max_fail_percentage 22225 1726882778.82316: checking to see if all hosts have failed and the running result is not ok 22225 1726882778.82317: done checking to see if all hosts have failed 22225 1726882778.82318: getting the remaining hosts for this loop 22225 1726882778.82320: done getting the remaining hosts for this loop 22225 1726882778.82329: getting the next task for host managed_node1 22225 1726882778.82339: done getting next task for host managed_node1 22225 1726882778.82343: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22225 1726882778.82348: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882778.82372: getting variables 22225 1726882778.82374: in VariableManager get_vars() 22225 1726882778.82627: Calling all_inventory to load vars for managed_node1 22225 1726882778.82631: Calling groups_inventory to load vars for managed_node1 22225 1726882778.82634: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882778.82645: Calling all_plugins_play to load vars for managed_node1 22225 1726882778.82648: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882778.82652: Calling groups_plugins_play to load vars for managed_node1 22225 1726882778.83341: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000070 22225 1726882778.83351: WORKER PROCESS EXITING 22225 1726882778.84316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882778.85968: done with get_vars() 22225 1726882778.86011: done getting variables 22225 1726882778.86089: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:39:38 -0400 (0:00:00.059) 0:00:34.253 ****** 22225 1726882778.86135: entering _queue_task() for managed_node1/fail 22225 1726882778.86540: worker is 1 (out of 1 available) 22225 1726882778.86556: exiting _queue_task() for managed_node1/fail 22225 1726882778.86569: done queuing things up, now waiting for results queue to drain 22225 1726882778.86570: waiting for pending results... 
22225 1726882778.86895: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22225 1726882778.87056: in run() - task 0affc7ec-ae25-ec05-55b7-000000000071 22225 1726882778.87081: variable 'ansible_search_path' from source: unknown 22225 1726882778.87092: variable 'ansible_search_path' from source: unknown 22225 1726882778.87138: calling self._execute() 22225 1726882778.87252: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882778.87269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882778.87288: variable 'omit' from source: magic vars 22225 1726882778.87694: variable 'ansible_distribution_major_version' from source: facts 22225 1726882778.87717: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882778.87856: variable 'network_state' from source: role '' defaults 22225 1726882778.87874: Evaluated conditional (network_state != {}): False 22225 1726882778.87885: when evaluation is False, skipping this task 22225 1726882778.87893: _execute() done 22225 1726882778.87902: dumping result to json 22225 1726882778.87909: done dumping result, returning 22225 1726882778.87927: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affc7ec-ae25-ec05-55b7-000000000071] 22225 1726882778.87938: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000071 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882778.88107: no more pending results, returning what we have 22225 1726882778.88111: results queue empty 22225 1726882778.88112: checking for any_errors_fatal 22225 1726882778.88124: done checking for any_errors_fatal 22225 1726882778.88125: checking for max_fail_percentage 22225 1726882778.88128: done checking for max_fail_percentage 22225 1726882778.88129: checking to see if all hosts have failed and the running result is not ok 22225 1726882778.88130: done checking to see if all hosts have failed 22225 1726882778.88130: getting the remaining hosts for this loop 22225 1726882778.88133: done getting the remaining hosts for this loop 22225 1726882778.88138: getting the next task for host managed_node1 22225 1726882778.88146: done getting next task for host managed_node1 22225 1726882778.88151: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22225 1726882778.88155: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882778.88182: getting variables 22225 1726882778.88185: in VariableManager get_vars() 22225 1726882778.88535: Calling all_inventory to load vars for managed_node1 22225 1726882778.88539: Calling groups_inventory to load vars for managed_node1 22225 1726882778.88542: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882778.88553: Calling all_plugins_play to load vars for managed_node1 22225 1726882778.88557: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882778.88560: Calling groups_plugins_play to load vars for managed_node1 22225 1726882778.89208: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000071 22225 1726882778.89212: WORKER PROCESS EXITING 22225 1726882778.89952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882778.91197: done with get_vars() 22225 1726882778.91223: done getting variables 22225 1726882778.91287: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:39:38 -0400 (0:00:00.051) 0:00:34.305 ****** 22225 1726882778.91319: entering _queue_task() for managed_node1/fail 22225 1726882778.91687: worker is 1 (out of 1 available) 22225 1726882778.91702: exiting _queue_task() for managed_node1/fail 22225 1726882778.91716: done queuing things up, now waiting for results queue to drain 22225 1726882778.91717: waiting for pending results... 
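The teaming guard queued here (task path roles/network/tasks/main.yml:25) is skipped a little differently: the trace below evaluates its first condition, ansible_distribution_major_version | int > 9, to True on this Fedora 40 host (per the fc40 package facts above), and the skip comes from the second condition, ansible_distribution in __network_rh_distros, evaluating to False. A hedged sketch consistent with those two conditions follows; the failure message is an assumption, and the role may apply further checks that are not visible in this trace.

# Hedged sketch; only the fail action and the two evaluated conditions come from the trace.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later.
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros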
22225 1726882778.92038: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22225 1726882778.92205: in run() - task 0affc7ec-ae25-ec05-55b7-000000000072 22225 1726882778.92231: variable 'ansible_search_path' from source: unknown 22225 1726882778.92242: variable 'ansible_search_path' from source: unknown 22225 1726882778.92294: calling self._execute() 22225 1726882778.92434: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882778.92438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882778.92442: variable 'omit' from source: magic vars 22225 1726882778.92747: variable 'ansible_distribution_major_version' from source: facts 22225 1726882778.92757: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882778.92896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882778.94728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882778.94732: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882778.94762: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882778.94809: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882778.94846: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882778.94941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882778.94986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882778.95017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882778.95064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882778.95085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882778.95198: variable 'ansible_distribution_major_version' from source: facts 22225 1726882778.95219: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22225 1726882778.95353: variable 'ansible_distribution' from source: facts 22225 1726882778.95363: variable '__network_rh_distros' from source: role '' defaults 22225 1726882778.95382: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22225 1726882778.95527: when evaluation is False, skipping this task 22225 1726882778.95530: _execute() done 22225 1726882778.95533: dumping result to json 22225 1726882778.95536: done dumping result, returning 22225 1726882778.95539: done running TaskExecutor() 
for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affc7ec-ae25-ec05-55b7-000000000072] 22225 1726882778.95541: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000072 22225 1726882778.95619: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000072 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22225 1726882778.95673: no more pending results, returning what we have 22225 1726882778.95676: results queue empty 22225 1726882778.95677: checking for any_errors_fatal 22225 1726882778.95683: done checking for any_errors_fatal 22225 1726882778.95684: checking for max_fail_percentage 22225 1726882778.95686: done checking for max_fail_percentage 22225 1726882778.95687: checking to see if all hosts have failed and the running result is not ok 22225 1726882778.95692: done checking to see if all hosts have failed 22225 1726882778.95693: getting the remaining hosts for this loop 22225 1726882778.95696: done getting the remaining hosts for this loop 22225 1726882778.95700: getting the next task for host managed_node1 22225 1726882778.95708: done getting next task for host managed_node1 22225 1726882778.95712: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22225 1726882778.95715: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882778.95807: WORKER PROCESS EXITING 22225 1726882778.95819: getting variables 22225 1726882778.95821: in VariableManager get_vars() 22225 1726882778.95861: Calling all_inventory to load vars for managed_node1 22225 1726882778.95864: Calling groups_inventory to load vars for managed_node1 22225 1726882778.95866: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882778.95875: Calling all_plugins_play to load vars for managed_node1 22225 1726882778.95877: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882778.95880: Calling groups_plugins_play to load vars for managed_node1 22225 1726882778.97710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882778.99905: done with get_vars() 22225 1726882778.99945: done getting variables 22225 1726882779.00015: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:39:38 -0400 (0:00:00.087) 0:00:34.393 ****** 22225 1726882779.00055: entering _queue_task() for managed_node1/dnf 22225 1726882779.00456: worker is 1 (out of 1 available) 22225 1726882779.00472: exiting _queue_task() for managed_node1/dnf 22225 1726882779.00488: done queuing things up, now waiting for results queue to drain 22225 1726882779.00490: waiting for pending results... 
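The DNF check queued here (task path roles/network/tasks/main.yml:36) only runs when the configured connections involve wireless or team interfaces; the trace below evaluates __network_wireless_connections_defined or __network_team_connections_defined against the play's network_connections/interface variables and skips the task. A hedged sketch of the shape such a task could take, under the assumption (not visible in the log) that the role keeps its package list in a variable called here network_packages and runs the module in check mode so that it only reports available updates:

# Hedged sketch; the dnf action and the when: conditions come from the trace,
# the package-list variable name, state, and check_mode are assumptions.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # assumed variable name
    state: latest
  check_mode: true                   # assumed: "check" rather than apply
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined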
22225 1726882779.00810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22225 1726882779.00982: in run() - task 0affc7ec-ae25-ec05-55b7-000000000073 22225 1726882779.01004: variable 'ansible_search_path' from source: unknown 22225 1726882779.01012: variable 'ansible_search_path' from source: unknown 22225 1726882779.01063: calling self._execute() 22225 1726882779.01176: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.01192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.01328: variable 'omit' from source: magic vars 22225 1726882779.01633: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.01653: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882779.01893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882779.04313: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882779.04402: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882779.04451: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882779.04727: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882779.04731: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882779.04734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.04737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.04739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.04756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.04778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.04926: variable 'ansible_distribution' from source: facts 22225 1726882779.04944: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.04958: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22225 1726882779.05127: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882779.05256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.05294: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.05328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.05376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.05406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.05508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.05512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.05515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.05576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.05579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.05615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.05642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.05663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.05691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.05702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.05814: variable 'network_connections' from source: task vars 22225 1726882779.05825: variable 'interface' from source: play vars 22225 1726882779.05878: variable 'interface' from source: play vars 22225 1726882779.05955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882779.06077: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882779.06107: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882779.06132: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882779.06161: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882779.06196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882779.06214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882779.06238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.06256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882779.06297: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882779.06476: variable 'network_connections' from source: task vars 22225 1726882779.06482: variable 'interface' from source: play vars 22225 1726882779.06529: variable 'interface' from source: play vars 22225 1726882779.06547: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22225 1726882779.06550: when evaluation is False, skipping this task 22225 1726882779.06553: _execute() done 22225 1726882779.06555: dumping result to json 22225 1726882779.06558: done dumping result, returning 22225 1726882779.06567: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affc7ec-ae25-ec05-55b7-000000000073] 22225 1726882779.06571: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000073 22225 1726882779.06671: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000073 22225 1726882779.06674: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22225 1726882779.06731: no more pending results, returning what we have 22225 1726882779.06735: results queue empty 22225 1726882779.06736: checking for any_errors_fatal 22225 1726882779.06742: done checking for any_errors_fatal 22225 1726882779.06743: checking for max_fail_percentage 22225 1726882779.06745: done checking for max_fail_percentage 22225 1726882779.06746: checking to see if all hosts have failed and the running result is not ok 22225 1726882779.06747: done checking to see if all hosts have failed 22225 1726882779.06747: getting the remaining hosts for this loop 22225 1726882779.06750: done getting the remaining hosts for this loop 22225 1726882779.06754: getting the next task for host managed_node1 22225 1726882779.06761: done getting next task for host managed_node1 22225 1726882779.06766: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 22225 1726882779.06769: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882779.06791: getting variables 22225 1726882779.06793: in VariableManager get_vars() 22225 1726882779.06841: Calling all_inventory to load vars for managed_node1 22225 1726882779.06844: Calling groups_inventory to load vars for managed_node1 22225 1726882779.06847: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882779.06856: Calling all_plugins_play to load vars for managed_node1 22225 1726882779.06859: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882779.06861: Calling groups_plugins_play to load vars for managed_node1 22225 1726882779.07996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882779.09432: done with get_vars() 22225 1726882779.09453: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22225 1726882779.09514: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:39:39 -0400 (0:00:00.094) 0:00:34.488 ****** 22225 1726882779.09542: entering _queue_task() for managed_node1/yum 22225 1726882779.09814: worker is 1 (out of 1 available) 22225 1726882779.09833: exiting _queue_task() for managed_node1/yum 22225 1726882779.09846: done queuing things up, now waiting for results queue to drain 22225 1726882779.09847: waiting for pending results... 
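
The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above is normal ansible-core behaviour on dnf-based hosts: the yum action name is an alias that resolves to the dnf action plugin. The skip that follows in the log comes from an ordinary when-gated task. A minimal sketch of that pattern is shown here; only the task name and the condition string are taken from the log that follows, the module arguments are assumptions, and this is not the role's actual source.

    # Illustrative sketch only, not the fedora.linux_system_roles.network source.
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"   # assumed argument, variable name seen elsewhere in the log
        state: latest
      check_mode: true                   # assumed: check for updates without changing anything
      when: ansible_distribution_major_version | int < 8

When the when expression renders False on the target, Ansible records exactly the kind of result seen below: changed false, the rendered false_condition, and skip_reason "Conditional result was False".
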
22225 1726882779.10048: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22225 1726882779.10146: in run() - task 0affc7ec-ae25-ec05-55b7-000000000074 22225 1726882779.10158: variable 'ansible_search_path' from source: unknown 22225 1726882779.10162: variable 'ansible_search_path' from source: unknown 22225 1726882779.10198: calling self._execute() 22225 1726882779.10276: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.10283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.10293: variable 'omit' from source: magic vars 22225 1726882779.10593: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.10603: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882779.10743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882779.12576: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882779.12628: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882779.12656: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882779.12685: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882779.12705: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882779.12770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.12791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.12813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.12845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.12856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.12930: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.12942: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22225 1726882779.12945: when evaluation is False, skipping this task 22225 1726882779.12948: _execute() done 22225 1726882779.12951: dumping result to json 22225 1726882779.12953: done dumping result, returning 22225 1726882779.12961: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affc7ec-ae25-ec05-55b7-000000000074] 22225 
1726882779.12965: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000074 22225 1726882779.13061: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000074 22225 1726882779.13064: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22225 1726882779.13118: no more pending results, returning what we have 22225 1726882779.13124: results queue empty 22225 1726882779.13125: checking for any_errors_fatal 22225 1726882779.13131: done checking for any_errors_fatal 22225 1726882779.13131: checking for max_fail_percentage 22225 1726882779.13133: done checking for max_fail_percentage 22225 1726882779.13134: checking to see if all hosts have failed and the running result is not ok 22225 1726882779.13135: done checking to see if all hosts have failed 22225 1726882779.13136: getting the remaining hosts for this loop 22225 1726882779.13138: done getting the remaining hosts for this loop 22225 1726882779.13142: getting the next task for host managed_node1 22225 1726882779.13150: done getting next task for host managed_node1 22225 1726882779.13153: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22225 1726882779.13157: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882779.13189: getting variables 22225 1726882779.13191: in VariableManager get_vars() 22225 1726882779.13239: Calling all_inventory to load vars for managed_node1 22225 1726882779.13242: Calling groups_inventory to load vars for managed_node1 22225 1726882779.13244: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882779.13253: Calling all_plugins_play to load vars for managed_node1 22225 1726882779.13256: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882779.13259: Calling groups_plugins_play to load vars for managed_node1 22225 1726882779.14354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882779.15985: done with get_vars() 22225 1726882779.16012: done getting variables 22225 1726882779.16072: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:39:39 -0400 (0:00:00.065) 0:00:34.553 ****** 22225 1726882779.16108: entering _queue_task() for managed_node1/fail 22225 1726882779.16458: worker is 1 (out of 1 available) 22225 1726882779.16471: exiting _queue_task() for managed_node1/fail 22225 1726882779.16486: done queuing things up, now waiting for results queue to drain 22225 1726882779.16488: waiting for pending results... 
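
The task just queued loads the builtin fail action and, as the following lines show, is skipped on the same wireless/team condition as the earlier package checks. A hypothetical sketch of a task of that shape (the task name and the when expression come from the log; the message wording is invented for illustration and is not the role's source):

    # Hypothetical sketch only.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: Restarting NetworkManager is required for wireless or team interfaces but has not been confirmed.   # assumed wording
      when: __network_wireless_connections_defined or __network_team_connections_defined
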
22225 1726882779.16857: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22225 1726882779.16929: in run() - task 0affc7ec-ae25-ec05-55b7-000000000075 22225 1726882779.16940: variable 'ansible_search_path' from source: unknown 22225 1726882779.16949: variable 'ansible_search_path' from source: unknown 22225 1726882779.16977: calling self._execute() 22225 1726882779.17059: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.17062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.17070: variable 'omit' from source: magic vars 22225 1726882779.17372: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.17387: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882779.17471: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882779.17618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882779.19211: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882779.19262: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882779.19292: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882779.19319: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882779.19341: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882779.19407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.19442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.19461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.19492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.19504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.19541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.19559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.19577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.19607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.19618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.19651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.19669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.19687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.19716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.19728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.19852: variable 'network_connections' from source: task vars 22225 1726882779.19862: variable 'interface' from source: play vars 22225 1726882779.19912: variable 'interface' from source: play vars 22225 1726882779.19967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882779.20086: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882779.20114: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882779.20140: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882779.20161: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882779.20195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882779.20212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882779.20233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.20253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882779.20291: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882779.20459: variable 'network_connections' from 
source: task vars 22225 1726882779.20463: variable 'interface' from source: play vars 22225 1726882779.20508: variable 'interface' from source: play vars 22225 1726882779.20528: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22225 1726882779.20532: when evaluation is False, skipping this task 22225 1726882779.20535: _execute() done 22225 1726882779.20538: dumping result to json 22225 1726882779.20540: done dumping result, returning 22225 1726882779.20548: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-ec05-55b7-000000000075] 22225 1726882779.20553: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000075 22225 1726882779.20649: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000075 22225 1726882779.20651: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22225 1726882779.20704: no more pending results, returning what we have 22225 1726882779.20707: results queue empty 22225 1726882779.20709: checking for any_errors_fatal 22225 1726882779.20716: done checking for any_errors_fatal 22225 1726882779.20717: checking for max_fail_percentage 22225 1726882779.20719: done checking for max_fail_percentage 22225 1726882779.20720: checking to see if all hosts have failed and the running result is not ok 22225 1726882779.20721: done checking to see if all hosts have failed 22225 1726882779.20723: getting the remaining hosts for this loop 22225 1726882779.20725: done getting the remaining hosts for this loop 22225 1726882779.20729: getting the next task for host managed_node1 22225 1726882779.20738: done getting next task for host managed_node1 22225 1726882779.20742: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22225 1726882779.20745: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882779.20764: getting variables 22225 1726882779.20766: in VariableManager get_vars() 22225 1726882779.20806: Calling all_inventory to load vars for managed_node1 22225 1726882779.20809: Calling groups_inventory to load vars for managed_node1 22225 1726882779.20812: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882779.20824: Calling all_plugins_play to load vars for managed_node1 22225 1726882779.20827: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882779.20830: Calling groups_plugins_play to load vars for managed_node1 22225 1726882779.21918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882779.23061: done with get_vars() 22225 1726882779.23078: done getting variables 22225 1726882779.23127: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:39:39 -0400 (0:00:00.070) 0:00:34.624 ****** 22225 1726882779.23153: entering _queue_task() for managed_node1/package 22225 1726882779.23402: worker is 1 (out of 1 available) 22225 1726882779.23418: exiting _queue_task() for managed_node1/package 22225 1726882779.23431: done queuing things up, now waiting for results queue to drain 22225 1726882779.23433: waiting for pending results... 22225 1726882779.23637: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 22225 1726882779.23723: in run() - task 0affc7ec-ae25-ec05-55b7-000000000076 22225 1726882779.23735: variable 'ansible_search_path' from source: unknown 22225 1726882779.23739: variable 'ansible_search_path' from source: unknown 22225 1726882779.23771: calling self._execute() 22225 1726882779.23849: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.23853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.23862: variable 'omit' from source: magic vars 22225 1726882779.24175: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.24187: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882779.24340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882779.24538: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882779.24572: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882779.24601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882779.24655: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882779.24737: variable 'network_packages' from source: role '' defaults 22225 1726882779.24816: variable '__network_provider_setup' from source: role '' defaults 22225 1726882779.24825: variable '__network_service_name_default_nm' from source: role '' defaults 22225 1726882779.24876: variable 
'__network_service_name_default_nm' from source: role '' defaults 22225 1726882779.24885: variable '__network_packages_default_nm' from source: role '' defaults 22225 1726882779.24932: variable '__network_packages_default_nm' from source: role '' defaults 22225 1726882779.25061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882779.26525: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882779.26573: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882779.26605: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882779.26633: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882779.26653: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882779.26721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.26744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.26763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.26794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.26805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.26845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.26862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.26879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.26910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.26926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.27080: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22225 1726882779.27163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.27180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.27200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.27228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.27240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.27307: variable 'ansible_python' from source: facts 22225 1726882779.27329: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22225 1726882779.27393: variable '__network_wpa_supplicant_required' from source: role '' defaults 22225 1726882779.27450: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22225 1726882779.27553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.27572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.27594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.27621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.27635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.27670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.27694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.27712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.27741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.27752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.27857: variable 'network_connections' from source: task vars 22225 1726882779.27862: variable 'interface' from source: play vars 22225 1726882779.27939: variable 'interface' from source: play vars 22225 1726882779.27994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882779.28017: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882779.28041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.28063: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882779.28101: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882779.28294: variable 'network_connections' from source: task vars 22225 1726882779.28298: variable 'interface' from source: play vars 22225 1726882779.28372: variable 'interface' from source: play vars 22225 1726882779.28398: variable '__network_packages_default_wireless' from source: role '' defaults 22225 1726882779.28458: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882779.28668: variable 'network_connections' from source: task vars 22225 1726882779.28671: variable 'interface' from source: play vars 22225 1726882779.28717: variable 'interface' from source: play vars 22225 1726882779.28736: variable '__network_packages_default_team' from source: role '' defaults 22225 1726882779.28795: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882779.29007: variable 'network_connections' from source: task vars 22225 1726882779.29010: variable 'interface' from source: play vars 22225 1726882779.29060: variable 'interface' from source: play vars 22225 1726882779.29104: variable '__network_service_name_default_initscripts' from source: role '' defaults 22225 1726882779.29146: variable '__network_service_name_default_initscripts' from source: role '' defaults 22225 1726882779.29152: variable '__network_packages_default_initscripts' from source: role '' defaults 22225 1726882779.29199: variable '__network_packages_default_initscripts' from source: role '' defaults 22225 1726882779.29347: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22225 1726882779.29666: variable 'network_connections' from source: task vars 22225 1726882779.29670: variable 'interface' from source: play vars 22225 1726882779.29717: variable 'interface' from source: play vars 22225 1726882779.29725: variable 'ansible_distribution' from source: facts 22225 1726882779.29728: variable '__network_rh_distros' from source: role '' defaults 22225 1726882779.29735: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.29746: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22225 1726882779.29863: variable 'ansible_distribution' from source: facts 22225 
1726882779.29867: variable '__network_rh_distros' from source: role '' defaults 22225 1726882779.29870: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.29878: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22225 1726882779.29996: variable 'ansible_distribution' from source: facts 22225 1726882779.29999: variable '__network_rh_distros' from source: role '' defaults 22225 1726882779.30005: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.30038: variable 'network_provider' from source: set_fact 22225 1726882779.30051: variable 'ansible_facts' from source: unknown 22225 1726882779.30633: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22225 1726882779.30637: when evaluation is False, skipping this task 22225 1726882779.30640: _execute() done 22225 1726882779.30642: dumping result to json 22225 1726882779.30648: done dumping result, returning 22225 1726882779.30657: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affc7ec-ae25-ec05-55b7-000000000076] 22225 1726882779.30661: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000076 22225 1726882779.30783: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000076 22225 1726882779.30786: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22225 1726882779.30838: no more pending results, returning what we have 22225 1726882779.30842: results queue empty 22225 1726882779.30843: checking for any_errors_fatal 22225 1726882779.30851: done checking for any_errors_fatal 22225 1726882779.30851: checking for max_fail_percentage 22225 1726882779.30853: done checking for max_fail_percentage 22225 1726882779.30854: checking to see if all hosts have failed and the running result is not ok 22225 1726882779.30855: done checking to see if all hosts have failed 22225 1726882779.30856: getting the remaining hosts for this loop 22225 1726882779.30858: done getting the remaining hosts for this loop 22225 1726882779.30862: getting the next task for host managed_node1 22225 1726882779.30869: done getting next task for host managed_node1 22225 1726882779.30874: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22225 1726882779.30877: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882779.30898: getting variables 22225 1726882779.30900: in VariableManager get_vars() 22225 1726882779.30941: Calling all_inventory to load vars for managed_node1 22225 1726882779.30944: Calling groups_inventory to load vars for managed_node1 22225 1726882779.30946: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882779.30955: Calling all_plugins_play to load vars for managed_node1 22225 1726882779.30958: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882779.30960: Calling groups_plugins_play to load vars for managed_node1 22225 1726882779.32429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882779.33621: done with get_vars() 22225 1726882779.33644: done getting variables 22225 1726882779.33705: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:39:39 -0400 (0:00:00.105) 0:00:34.730 ****** 22225 1726882779.33740: entering _queue_task() for managed_node1/package 22225 1726882779.34047: worker is 1 (out of 1 available) 22225 1726882779.34062: exiting _queue_task() for managed_node1/package 22225 1726882779.34074: done queuing things up, now waiting for results queue to drain 22225 1726882779.34075: waiting for pending results... 
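
The "Install packages" skip above is driven by the builtin subset test (loaded earlier from plugins/test/mathstuff.py): the task only runs when at least one entry of network_packages is missing from the gathered package facts, so an already-satisfied package set produces a skip rather than a package-manager call. A minimal sketch of that gate, with the task name, module and condition taken from the log and the argument assumed:

    # Illustrative sketch only, not the role's actual task.
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # assumed argument
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())
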
22225 1726882779.34274: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22225 1726882779.34373: in run() - task 0affc7ec-ae25-ec05-55b7-000000000077 22225 1726882779.34389: variable 'ansible_search_path' from source: unknown 22225 1726882779.34394: variable 'ansible_search_path' from source: unknown 22225 1726882779.34426: calling self._execute() 22225 1726882779.34505: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.34509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.34521: variable 'omit' from source: magic vars 22225 1726882779.35027: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.35030: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882779.35032: variable 'network_state' from source: role '' defaults 22225 1726882779.35034: Evaluated conditional (network_state != {}): False 22225 1726882779.35037: when evaluation is False, skipping this task 22225 1726882779.35042: _execute() done 22225 1726882779.35045: dumping result to json 22225 1726882779.35047: done dumping result, returning 22225 1726882779.35049: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affc7ec-ae25-ec05-55b7-000000000077] 22225 1726882779.35052: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000077 22225 1726882779.35147: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000077 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882779.35199: no more pending results, returning what we have 22225 1726882779.35204: results queue empty 22225 1726882779.35205: checking for any_errors_fatal 22225 1726882779.35216: done checking for any_errors_fatal 22225 1726882779.35217: checking for max_fail_percentage 22225 1726882779.35219: done checking for max_fail_percentage 22225 1726882779.35220: checking to see if all hosts have failed and the running result is not ok 22225 1726882779.35220: done checking to see if all hosts have failed 22225 1726882779.35221: getting the remaining hosts for this loop 22225 1726882779.35226: done getting the remaining hosts for this loop 22225 1726882779.35231: getting the next task for host managed_node1 22225 1726882779.35239: done getting next task for host managed_node1 22225 1726882779.35243: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22225 1726882779.35246: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882779.35275: getting variables 22225 1726882779.35277: in VariableManager get_vars() 22225 1726882779.35320: Calling all_inventory to load vars for managed_node1 22225 1726882779.35328: Calling groups_inventory to load vars for managed_node1 22225 1726882779.35331: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882779.35345: Calling all_plugins_play to load vars for managed_node1 22225 1726882779.35348: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882779.35351: Calling groups_plugins_play to load vars for managed_node1 22225 1726882779.35870: WORKER PROCESS EXITING 22225 1726882779.39927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882779.41092: done with get_vars() 22225 1726882779.41115: done getting variables 22225 1726882779.41156: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:39:39 -0400 (0:00:00.074) 0:00:34.804 ****** 22225 1726882779.41182: entering _queue_task() for managed_node1/package 22225 1726882779.41471: worker is 1 (out of 1 available) 22225 1726882779.41487: exiting _queue_task() for managed_node1/package 22225 1726882779.41499: done queuing things up, now waiting for results queue to drain 22225 1726882779.41501: waiting for pending results... 
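
Both network_state-gated install tasks (main.yml:85 above and main.yml:96 just queued) are skipped in this run because network_state is empty; they only apply when the caller drives the role with a declarative network_state rather than network_connections. A hypothetical sketch of the first of the two, with the task name and condition from the log and the package list inferred from the task title:

    # Hypothetical sketch only.
    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager   # inferred from the task title
          - nmstate          # inferred from the task title
        state: present
      when: network_state != {}
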
22225 1726882779.41842: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22225 1726882779.41874: in run() - task 0affc7ec-ae25-ec05-55b7-000000000078 22225 1726882779.41898: variable 'ansible_search_path' from source: unknown 22225 1726882779.41906: variable 'ansible_search_path' from source: unknown 22225 1726882779.41950: calling self._execute() 22225 1726882779.42059: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.42071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.42086: variable 'omit' from source: magic vars 22225 1726882779.42467: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.42489: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882779.42613: variable 'network_state' from source: role '' defaults 22225 1726882779.42638: Evaluated conditional (network_state != {}): False 22225 1726882779.42645: when evaluation is False, skipping this task 22225 1726882779.42652: _execute() done 22225 1726882779.42660: dumping result to json 22225 1726882779.42666: done dumping result, returning 22225 1726882779.42676: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affc7ec-ae25-ec05-55b7-000000000078] 22225 1726882779.42689: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000078 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882779.42869: no more pending results, returning what we have 22225 1726882779.42873: results queue empty 22225 1726882779.42874: checking for any_errors_fatal 22225 1726882779.42884: done checking for any_errors_fatal 22225 1726882779.42885: checking for max_fail_percentage 22225 1726882779.42886: done checking for max_fail_percentage 22225 1726882779.42887: checking to see if all hosts have failed and the running result is not ok 22225 1726882779.42888: done checking to see if all hosts have failed 22225 1726882779.42889: getting the remaining hosts for this loop 22225 1726882779.42891: done getting the remaining hosts for this loop 22225 1726882779.42896: getting the next task for host managed_node1 22225 1726882779.42904: done getting next task for host managed_node1 22225 1726882779.42907: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22225 1726882779.43124: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882779.43146: getting variables 22225 1726882779.43148: in VariableManager get_vars() 22225 1726882779.43184: Calling all_inventory to load vars for managed_node1 22225 1726882779.43186: Calling groups_inventory to load vars for managed_node1 22225 1726882779.43189: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882779.43198: Calling all_plugins_play to load vars for managed_node1 22225 1726882779.43201: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882779.43205: Calling groups_plugins_play to load vars for managed_node1 22225 1726882779.43724: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000078 22225 1726882779.43729: WORKER PROCESS EXITING 22225 1726882779.44443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882779.45620: done with get_vars() 22225 1726882779.45640: done getting variables 22225 1726882779.45688: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:39:39 -0400 (0:00:00.045) 0:00:34.849 ****** 22225 1726882779.45715: entering _queue_task() for managed_node1/service 22225 1726882779.45977: worker is 1 (out of 1 available) 22225 1726882779.45995: exiting _queue_task() for managed_node1/service 22225 1726882779.46007: done queuing things up, now waiting for results queue to drain 22225 1726882779.46009: waiting for pending results... 
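
The service task just queued is, as the next lines show, skipped on the same wireless/team condition. A minimal sketch of a conditional restart of that shape, with the task name, action plugin and condition taken from the log and the service name inferred from the title:

    # Illustrative sketch only.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager   # inferred from the task title
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined
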
22225 1726882779.46201: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22225 1726882779.46302: in run() - task 0affc7ec-ae25-ec05-55b7-000000000079 22225 1726882779.46316: variable 'ansible_search_path' from source: unknown 22225 1726882779.46319: variable 'ansible_search_path' from source: unknown 22225 1726882779.46354: calling self._execute() 22225 1726882779.46435: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.46440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.46449: variable 'omit' from source: magic vars 22225 1726882779.46753: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.46763: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882779.46859: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882779.47011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882779.48646: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882779.48704: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882779.48735: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882779.48763: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882779.48786: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882779.48863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.48877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.48899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.48928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.48940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.48979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.48999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.49017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22225 1726882779.49046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.49057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.49093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.49110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.49130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.49156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.49167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.49295: variable 'network_connections' from source: task vars 22225 1726882779.49303: variable 'interface' from source: play vars 22225 1726882779.49353: variable 'interface' from source: play vars 22225 1726882779.49413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882779.49532: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882779.49568: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882779.49594: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882779.49622: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882779.49655: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882779.49671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882779.49692: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.49710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882779.49754: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882779.49920: variable 'network_connections' from source: task vars 22225 1726882779.49926: variable 'interface' from source: 
play vars 22225 1726882779.49974: variable 'interface' from source: play vars 22225 1726882779.49994: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22225 1726882779.49998: when evaluation is False, skipping this task 22225 1726882779.50001: _execute() done 22225 1726882779.50004: dumping result to json 22225 1726882779.50006: done dumping result, returning 22225 1726882779.50012: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affc7ec-ae25-ec05-55b7-000000000079] 22225 1726882779.50018: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000079 22225 1726882779.50116: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000079 22225 1726882779.50127: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22225 1726882779.50174: no more pending results, returning what we have 22225 1726882779.50178: results queue empty 22225 1726882779.50179: checking for any_errors_fatal 22225 1726882779.50186: done checking for any_errors_fatal 22225 1726882779.50187: checking for max_fail_percentage 22225 1726882779.50189: done checking for max_fail_percentage 22225 1726882779.50190: checking to see if all hosts have failed and the running result is not ok 22225 1726882779.50190: done checking to see if all hosts have failed 22225 1726882779.50191: getting the remaining hosts for this loop 22225 1726882779.50193: done getting the remaining hosts for this loop 22225 1726882779.50198: getting the next task for host managed_node1 22225 1726882779.50205: done getting next task for host managed_node1 22225 1726882779.50209: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22225 1726882779.50212: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882779.50233: getting variables 22225 1726882779.50234: in VariableManager get_vars() 22225 1726882779.50276: Calling all_inventory to load vars for managed_node1 22225 1726882779.50279: Calling groups_inventory to load vars for managed_node1 22225 1726882779.50281: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882779.50290: Calling all_plugins_play to load vars for managed_node1 22225 1726882779.50293: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882779.50295: Calling groups_plugins_play to load vars for managed_node1 22225 1726882779.51325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882779.52557: done with get_vars() 22225 1726882779.52576: done getting variables 22225 1726882779.52626: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:39:39 -0400 (0:00:00.069) 0:00:34.919 ****** 22225 1726882779.52651: entering _queue_task() for managed_node1/service 22225 1726882779.52909: worker is 1 (out of 1 available) 22225 1726882779.52927: exiting _queue_task() for managed_node1/service 22225 1726882779.52940: done queuing things up, now waiting for results queue to drain 22225 1726882779.52941: waiting for pending results... 22225 1726882779.53137: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22225 1726882779.53257: in run() - task 0affc7ec-ae25-ec05-55b7-00000000007a 22225 1726882779.53269: variable 'ansible_search_path' from source: unknown 22225 1726882779.53275: variable 'ansible_search_path' from source: unknown 22225 1726882779.53310: calling self._execute() 22225 1726882779.53393: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.53398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.53408: variable 'omit' from source: magic vars 22225 1726882779.53827: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.53831: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882779.53976: variable 'network_provider' from source: set_fact 22225 1726882779.53990: variable 'network_state' from source: role '' defaults 22225 1726882779.54005: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22225 1726882779.54016: variable 'omit' from source: magic vars 22225 1726882779.54084: variable 'omit' from source: magic vars 22225 1726882779.54119: variable 'network_service_name' from source: role '' defaults 22225 1726882779.54195: variable 'network_service_name' from source: role '' defaults 22225 1726882779.54310: variable '__network_provider_setup' from source: role '' defaults 22225 1726882779.54428: variable '__network_service_name_default_nm' from source: role '' defaults 22225 1726882779.54432: variable '__network_service_name_default_nm' from source: role '' defaults 22225 1726882779.54435: variable '__network_packages_default_nm' from source: role '' defaults 
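At this point in the log the role has resolved network_service_name from its defaults and the conditional (network_provider == "nm" or network_state != {}) has evaluated True, so the 'Enable and start NetworkManager' task (roles/network/tasks/main.yml:122) is about to run through the 'service' action plugin. The task file itself is not reproduced in this log; the sketch below is reconstructed from the module arguments recorded further down (ansible.legacy.systemd called with name=NetworkManager, state=started, enabled=true) and is an assumption about the role's source, not a copy of it:

    # Hedged sketch of the task being executed here, inferred from the logged
    # module invocation; the real task lives at roles/network/tasks/main.yml:122.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolves to NetworkManager for the nm provider
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
      no_log: true   # assumed; the task result later in this log is censored, consistent with no_log

On this host the service action redirects to the systemd module, which is why an AnsiballZ_systemd.py payload is transferred and executed over SSH in the steps that follow.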
22225 1726882779.54478: variable '__network_packages_default_nm' from source: role '' defaults 22225 1726882779.54730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882779.56689: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882779.56745: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882779.56773: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882779.56803: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882779.56829: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882779.56894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.56923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.56942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.56970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.56984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.57023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.57041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.57061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.57230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.57235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.57365: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22225 1726882779.57491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.57529: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.57559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.57603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.57626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.57724: variable 'ansible_python' from source: facts 22225 1726882779.57751: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22225 1726882779.57840: variable '__network_wpa_supplicant_required' from source: role '' defaults 22225 1726882779.57932: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22225 1726882779.58072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.58103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.58136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.58182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.58201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.58258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882779.58305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882779.58343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.58429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882779.58432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882779.58557: variable 'network_connections' from 
source: task vars 22225 1726882779.58570: variable 'interface' from source: play vars 22225 1726882779.58649: variable 'interface' from source: play vars 22225 1726882779.58773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882779.58972: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882779.59227: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882779.59230: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882779.59232: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882779.59236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882779.59238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882779.59240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882779.59281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 1726882779.59340: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882779.59626: variable 'network_connections' from source: task vars 22225 1726882779.59641: variable 'interface' from source: play vars 22225 1726882779.59720: variable 'interface' from source: play vars 22225 1726882779.59762: variable '__network_packages_default_wireless' from source: role '' defaults 22225 1726882779.59853: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882779.60171: variable 'network_connections' from source: task vars 22225 1726882779.60181: variable 'interface' from source: play vars 22225 1726882779.60271: variable 'interface' from source: play vars 22225 1726882779.60298: variable '__network_packages_default_team' from source: role '' defaults 22225 1726882779.60386: variable '__network_team_connections_defined' from source: role '' defaults 22225 1726882779.60772: variable 'network_connections' from source: task vars 22225 1726882779.60783: variable 'interface' from source: play vars 22225 1726882779.60861: variable 'interface' from source: play vars 22225 1726882779.60918: variable '__network_service_name_default_initscripts' from source: role '' defaults 22225 1726882779.60988: variable '__network_service_name_default_initscripts' from source: role '' defaults 22225 1726882779.61000: variable '__network_packages_default_initscripts' from source: role '' defaults 22225 1726882779.61069: variable '__network_packages_default_initscripts' from source: role '' defaults 22225 1726882779.61305: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22225 1726882779.62107: variable 'network_connections' from source: task vars 22225 1726882779.62426: variable 'interface' from source: play vars 22225 1726882779.62429: variable 'interface' from source: play vars 22225 
1726882779.62431: variable 'ansible_distribution' from source: facts 22225 1726882779.62433: variable '__network_rh_distros' from source: role '' defaults 22225 1726882779.62435: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.62437: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22225 1726882779.62532: variable 'ansible_distribution' from source: facts 22225 1726882779.62540: variable '__network_rh_distros' from source: role '' defaults 22225 1726882779.62549: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.62557: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22225 1726882779.62745: variable 'ansible_distribution' from source: facts 22225 1726882779.62756: variable '__network_rh_distros' from source: role '' defaults 22225 1726882779.62766: variable 'ansible_distribution_major_version' from source: facts 22225 1726882779.62806: variable 'network_provider' from source: set_fact 22225 1726882779.62840: variable 'omit' from source: magic vars 22225 1726882779.62875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882779.62912: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882779.62940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882779.62964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882779.62980: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882779.63016: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882779.63028: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.63037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.63144: Set connection var ansible_connection to ssh 22225 1726882779.63161: Set connection var ansible_pipelining to False 22225 1726882779.63174: Set connection var ansible_shell_executable to /bin/sh 22225 1726882779.63186: Set connection var ansible_timeout to 10 22225 1726882779.63193: Set connection var ansible_shell_type to sh 22225 1726882779.63204: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882779.63244: variable 'ansible_shell_executable' from source: unknown 22225 1726882779.63252: variable 'ansible_connection' from source: unknown 22225 1726882779.63260: variable 'ansible_module_compression' from source: unknown 22225 1726882779.63327: variable 'ansible_shell_type' from source: unknown 22225 1726882779.63330: variable 'ansible_shell_executable' from source: unknown 22225 1726882779.63333: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882779.63335: variable 'ansible_pipelining' from source: unknown 22225 1726882779.63337: variable 'ansible_timeout' from source: unknown 22225 1726882779.63339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882779.63413: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 22225 1726882779.63435: variable 'omit' from source: magic vars 22225 1726882779.63451: starting attempt loop 22225 1726882779.63458: running the handler 22225 1726882779.63544: variable 'ansible_facts' from source: unknown 22225 1726882779.64461: _low_level_execute_command(): starting 22225 1726882779.64474: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882779.65183: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882779.65200: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882779.65216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882779.65239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882779.65258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882779.65271: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882779.65286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882779.65306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882779.65398: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882779.65426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882779.65529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882779.67280: stdout chunk (state=3): >>>/root <<< 22225 1726882779.67439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882779.67532: stderr chunk (state=3): >>><<< 22225 1726882779.67543: stdout chunk (state=3): >>><<< 22225 1726882779.67568: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882779.67587: _low_level_execute_command(): starting 22225 1726882779.67598: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787 `" && echo ansible-tmp-1726882779.675745-23458-134541574522787="` echo /root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787 `" ) && sleep 0' 22225 1726882779.68197: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882779.68213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882779.68230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882779.68251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882779.68270: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882779.68283: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882779.68299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882779.68342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882779.68411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882779.68431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882779.68452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882779.68532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882779.70556: stdout chunk (state=3): >>>ansible-tmp-1726882779.675745-23458-134541574522787=/root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787 <<< 22225 1726882779.70671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882779.70764: stderr chunk (state=3): >>><<< 22225 1726882779.70775: stdout chunk (state=3): >>><<< 22225 1726882779.70800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882779.675745-23458-134541574522787=/root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882779.70846: variable 'ansible_module_compression' from source: unknown 22225 1726882779.70904: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 22225 1726882779.70978: variable 'ansible_facts' from source: unknown 22225 1726882779.71173: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/AnsiballZ_systemd.py 22225 1726882779.71390: Sending initial data 22225 1726882779.71393: Sent initial data (155 bytes) 22225 1726882779.72055: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882779.72140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882779.72175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882779.72196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882779.72221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882779.72306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882779.73935: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882779.74009: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882779.74070: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp06csbjeg /root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/AnsiballZ_systemd.py <<< 22225 1726882779.74074: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/AnsiballZ_systemd.py" <<< 22225 1726882779.74117: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp06csbjeg" to remote "/root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/AnsiballZ_systemd.py" <<< 22225 1726882779.75842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882779.75857: stderr chunk (state=3): >>><<< 22225 1726882779.75990: stdout chunk (state=3): >>><<< 22225 1726882779.75993: done transferring module to remote 22225 1726882779.75996: _low_level_execute_command(): starting 22225 1726882779.75998: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/ /root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/AnsiballZ_systemd.py && sleep 0' 22225 1726882779.76588: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882779.76604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882779.76618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882779.76639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882779.76674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882779.76692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882779.76779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882779.76802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882779.76882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882779.78778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882779.78782: stdout chunk (state=3): >>><<< 22225 1726882779.79027: stderr chunk (state=3): >>><<< 22225 1726882779.79031: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882779.79034: _low_level_execute_command(): starting 22225 1726882779.79036: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/AnsiballZ_systemd.py && sleep 0' 22225 1726882779.79464: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882779.79494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882779.79501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882779.79536: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882779.79606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882779.79641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882779.79648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882779.79733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882780.11747: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "12001280", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3489005568", "CPUUsageNSec": "1662316000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service 
multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22225 1726882780.13682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882780.13687: stdout chunk (state=3): >>><<< 22225 1726882780.13690: stderr chunk (state=3): >>><<< 22225 1726882780.13709: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "678", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "28617093", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "678", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3593", "MemoryCurrent": "12001280", "MemoryPeak": "13942784", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3489005568", "CPUUsageNSec": "1662316000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service multi-user.target shutdown.target network.target cloud-init.service network.service", "After": "basic.target network-pre.target dbus.socket sysinit.target cloud-init-local.service system.slice systemd-journald.socket dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:33:05 EDT", "StateChangeTimestampMonotonic": "343605675", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "28617259", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:51 EDT", "ActiveEnterTimestampMonotonic": "29575861", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "28609732", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "28609736", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "521d937a906d4850835bc71360e9af97", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 
originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882780.13974: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882780.14028: _low_level_execute_command(): starting 22225 1726882780.14031: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882779.675745-23458-134541574522787/ > /dev/null 2>&1 && sleep 0' 22225 1726882780.14934: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882780.14949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882780.14964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882780.14987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882780.15005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882780.15028: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882780.15046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882780.15153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882780.15178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882780.15205: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882780.15365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882780.17333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882780.17345: stdout chunk (state=3): >>><<< 22225 1726882780.17360: stderr chunk (state=3): >>><<< 22225 1726882780.17383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882780.17397: handler run complete 22225 1726882780.17481: attempt loop complete, returning result 22225 1726882780.17494: _execute() done 22225 1726882780.17501: dumping result to json 22225 1726882780.17527: done dumping result, returning 22225 1726882780.17548: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affc7ec-ae25-ec05-55b7-00000000007a] 22225 1726882780.17559: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007a ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882780.18682: no more pending results, returning what we have 22225 1726882780.18686: results queue empty 22225 1726882780.18687: checking for any_errors_fatal 22225 1726882780.18691: done checking for any_errors_fatal 22225 1726882780.18691: checking for max_fail_percentage 22225 1726882780.18693: done checking for max_fail_percentage 22225 1726882780.18694: checking to see if all hosts have failed and the running result is not ok 22225 1726882780.18695: done checking to see if all hosts have failed 22225 1726882780.18696: getting the remaining hosts for this loop 22225 1726882780.18697: done getting the remaining hosts for this loop 22225 1726882780.18701: getting the next task for host managed_node1 22225 1726882780.18707: done getting next task for host managed_node1 22225 1726882780.18710: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22225 1726882780.18713: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882780.18726: getting variables 22225 1726882780.18728: in VariableManager get_vars() 22225 1726882780.18764: Calling all_inventory to load vars for managed_node1 22225 1726882780.18767: Calling groups_inventory to load vars for managed_node1 22225 1726882780.18769: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882780.18779: Calling all_plugins_play to load vars for managed_node1 22225 1726882780.18785: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882780.18789: Calling groups_plugins_play to load vars for managed_node1 22225 1726882780.19439: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007a 22225 1726882780.19443: WORKER PROCESS EXITING 22225 1726882780.20438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882780.22711: done with get_vars() 22225 1726882780.22751: done getting variables 22225 1726882780.22818: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:39:40 -0400 (0:00:00.702) 0:00:35.621 ****** 22225 1726882780.22861: entering _queue_task() for managed_node1/service 22225 1726882780.23255: worker is 1 (out of 1 available) 22225 1726882780.23270: exiting _queue_task() for managed_node1/service 22225 1726882780.23400: done queuing things up, now waiting for results queue to drain 22225 1726882780.23402: waiting for pending results... 
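For context on the "Enable and start NetworkManager" result recorded above (module_args: name=NetworkManager, state=started, enabled=true, scope=system), a task of roughly the following shape would produce that invocation. This is only an illustrative sketch of an equivalent task, not the role's actual source; the task name and the ansible.builtin.systemd spelling are assumptions.

# Illustrative reconstruction only -- not taken from the role source.
# It mirrors the module_args visible in the logged systemd result above.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
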
22225 1726882780.23621: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22225 1726882780.23787: in run() - task 0affc7ec-ae25-ec05-55b7-00000000007b 22225 1726882780.23809: variable 'ansible_search_path' from source: unknown 22225 1726882780.23817: variable 'ansible_search_path' from source: unknown 22225 1726882780.23870: calling self._execute() 22225 1726882780.23987: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882780.24058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882780.24062: variable 'omit' from source: magic vars 22225 1726882780.24496: variable 'ansible_distribution_major_version' from source: facts 22225 1726882780.24518: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882780.24647: variable 'network_provider' from source: set_fact 22225 1726882780.24659: Evaluated conditional (network_provider == "nm"): True 22225 1726882780.24764: variable '__network_wpa_supplicant_required' from source: role '' defaults 22225 1726882780.24931: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22225 1726882780.25076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882780.27790: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882780.27877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882780.27926: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882780.27974: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882780.28012: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882780.28131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882780.28167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882780.28305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882780.28310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882780.28314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882780.28345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882780.28373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 22225 1726882780.28408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882780.28463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882780.28525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882780.28545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882780.28574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882780.28607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882780.28664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882780.28689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882780.28928: variable 'network_connections' from source: task vars 22225 1726882780.28931: variable 'interface' from source: play vars 22225 1726882780.28972: variable 'interface' from source: play vars 22225 1726882780.29062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882780.29261: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882780.29315: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882780.29397: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882780.29402: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882780.29506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22225 1726882780.29510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22225 1726882780.29514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882780.29542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22225 
1726882780.29599: variable '__network_wireless_connections_defined' from source: role '' defaults 22225 1726882780.29911: variable 'network_connections' from source: task vars 22225 1726882780.29939: variable 'interface' from source: play vars 22225 1726882780.30007: variable 'interface' from source: play vars 22225 1726882780.30049: Evaluated conditional (__network_wpa_supplicant_required): False 22225 1726882780.30127: when evaluation is False, skipping this task 22225 1726882780.30131: _execute() done 22225 1726882780.30133: dumping result to json 22225 1726882780.30136: done dumping result, returning 22225 1726882780.30138: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affc7ec-ae25-ec05-55b7-00000000007b] 22225 1726882780.30148: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007b 22225 1726882780.30236: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007b 22225 1726882780.30240: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22225 1726882780.30298: no more pending results, returning what we have 22225 1726882780.30302: results queue empty 22225 1726882780.30303: checking for any_errors_fatal 22225 1726882780.30330: done checking for any_errors_fatal 22225 1726882780.30331: checking for max_fail_percentage 22225 1726882780.30333: done checking for max_fail_percentage 22225 1726882780.30335: checking to see if all hosts have failed and the running result is not ok 22225 1726882780.30336: done checking to see if all hosts have failed 22225 1726882780.30336: getting the remaining hosts for this loop 22225 1726882780.30338: done getting the remaining hosts for this loop 22225 1726882780.30343: getting the next task for host managed_node1 22225 1726882780.30354: done getting next task for host managed_node1 22225 1726882780.30358: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22225 1726882780.30361: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882780.30383: getting variables 22225 1726882780.30386: in VariableManager get_vars() 22225 1726882780.30642: Calling all_inventory to load vars for managed_node1 22225 1726882780.30645: Calling groups_inventory to load vars for managed_node1 22225 1726882780.30648: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882780.30659: Calling all_plugins_play to load vars for managed_node1 22225 1726882780.30662: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882780.30666: Calling groups_plugins_play to load vars for managed_node1 22225 1726882780.32688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882780.35803: done with get_vars() 22225 1726882780.36031: done getting variables 22225 1726882780.36104: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:39:40 -0400 (0:00:00.133) 0:00:35.754 ****** 22225 1726882780.36193: entering _queue_task() for managed_node1/service 22225 1726882780.36800: worker is 1 (out of 1 available) 22225 1726882780.36814: exiting _queue_task() for managed_node1/service 22225 1726882780.36835: done queuing things up, now waiting for results queue to drain 22225 1726882780.36837: waiting for pending results... 22225 1726882780.37439: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 22225 1726882780.37446: in run() - task 0affc7ec-ae25-ec05-55b7-00000000007c 22225 1726882780.37449: variable 'ansible_search_path' from source: unknown 22225 1726882780.37451: variable 'ansible_search_path' from source: unknown 22225 1726882780.37455: calling self._execute() 22225 1726882780.37457: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882780.37460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882780.37464: variable 'omit' from source: magic vars 22225 1726882780.37875: variable 'ansible_distribution_major_version' from source: facts 22225 1726882780.37894: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882780.38028: variable 'network_provider' from source: set_fact 22225 1726882780.38039: Evaluated conditional (network_provider == "initscripts"): False 22225 1726882780.38046: when evaluation is False, skipping this task 22225 1726882780.38053: _execute() done 22225 1726882780.38060: dumping result to json 22225 1726882780.38067: done dumping result, returning 22225 1726882780.38077: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affc7ec-ae25-ec05-55b7-00000000007c] 22225 1726882780.38086: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007c 22225 1726882780.38206: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007c skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22225 1726882780.38270: no more pending results, returning 
what we have 22225 1726882780.38274: results queue empty 22225 1726882780.38275: checking for any_errors_fatal 22225 1726882780.38285: done checking for any_errors_fatal 22225 1726882780.38286: checking for max_fail_percentage 22225 1726882780.38288: done checking for max_fail_percentage 22225 1726882780.38289: checking to see if all hosts have failed and the running result is not ok 22225 1726882780.38290: done checking to see if all hosts have failed 22225 1726882780.38291: getting the remaining hosts for this loop 22225 1726882780.38293: done getting the remaining hosts for this loop 22225 1726882780.38298: getting the next task for host managed_node1 22225 1726882780.38306: done getting next task for host managed_node1 22225 1726882780.38311: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22225 1726882780.38314: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882780.38441: getting variables 22225 1726882780.38444: in VariableManager get_vars() 22225 1726882780.38498: Calling all_inventory to load vars for managed_node1 22225 1726882780.38501: Calling groups_inventory to load vars for managed_node1 22225 1726882780.38504: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882780.38518: Calling all_plugins_play to load vars for managed_node1 22225 1726882780.38521: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882780.38631: Calling groups_plugins_play to load vars for managed_node1 22225 1726882780.39340: WORKER PROCESS EXITING 22225 1726882780.40320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882780.43318: done with get_vars() 22225 1726882780.43548: done getting variables 22225 1726882780.43611: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:39:40 -0400 (0:00:00.074) 0:00:35.829 ****** 22225 1726882780.43650: entering _queue_task() for managed_node1/copy 22225 1726882780.44397: worker is 1 (out of 1 available) 22225 1726882780.44414: exiting _queue_task() for managed_node1/copy 22225 1726882780.44430: done queuing things up, now waiting for results queue to drain 22225 1726882780.44432: waiting for pending results... 
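The skips recorded in this stretch of the log hinge on the provider check it prints ("Evaluated conditional (network_provider == \"initscripts\"): False"): because the provider is "nm", the initscripts-only service and copy tasks never run against managed_node1. A minimal, hypothetical sketch of such a provider-gated task (not taken from the role) would be:

# Hypothetical sketch of a provider-gated task; the "skipping" entries in this
# log come from a when: condition that evaluates to False under the nm provider.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"
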
22225 1726882780.44939: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22225 1726882780.45631: in run() - task 0affc7ec-ae25-ec05-55b7-00000000007d 22225 1726882780.45635: variable 'ansible_search_path' from source: unknown 22225 1726882780.45638: variable 'ansible_search_path' from source: unknown 22225 1726882780.45640: calling self._execute() 22225 1726882780.45708: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882780.45724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882780.45744: variable 'omit' from source: magic vars 22225 1726882780.46586: variable 'ansible_distribution_major_version' from source: facts 22225 1726882780.46611: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882780.46864: variable 'network_provider' from source: set_fact 22225 1726882780.46939: Evaluated conditional (network_provider == "initscripts"): False 22225 1726882780.46948: when evaluation is False, skipping this task 22225 1726882780.46956: _execute() done 22225 1726882780.46964: dumping result to json 22225 1726882780.46972: done dumping result, returning 22225 1726882780.46992: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affc7ec-ae25-ec05-55b7-00000000007d] 22225 1726882780.47049: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007d skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22225 1726882780.47245: no more pending results, returning what we have 22225 1726882780.47250: results queue empty 22225 1726882780.47251: checking for any_errors_fatal 22225 1726882780.47262: done checking for any_errors_fatal 22225 1726882780.47263: checking for max_fail_percentage 22225 1726882780.47265: done checking for max_fail_percentage 22225 1726882780.47266: checking to see if all hosts have failed and the running result is not ok 22225 1726882780.47267: done checking to see if all hosts have failed 22225 1726882780.47268: getting the remaining hosts for this loop 22225 1726882780.47270: done getting the remaining hosts for this loop 22225 1726882780.47275: getting the next task for host managed_node1 22225 1726882780.47284: done getting next task for host managed_node1 22225 1726882780.47288: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22225 1726882780.47292: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882780.47313: getting variables 22225 1726882780.47315: in VariableManager get_vars() 22225 1726882780.47371: Calling all_inventory to load vars for managed_node1 22225 1726882780.47374: Calling groups_inventory to load vars for managed_node1 22225 1726882780.47376: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882780.47649: Calling all_plugins_play to load vars for managed_node1 22225 1726882780.47653: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882780.47657: Calling groups_plugins_play to load vars for managed_node1 22225 1726882780.48463: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007d 22225 1726882780.48468: WORKER PROCESS EXITING 22225 1726882780.50508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882780.53679: done with get_vars() 22225 1726882780.53721: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:39:40 -0400 (0:00:00.105) 0:00:35.934 ****** 22225 1726882780.54227: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 22225 1726882780.55034: worker is 1 (out of 1 available) 22225 1726882780.55052: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 22225 1726882780.55067: done queuing things up, now waiting for results queue to drain 22225 1726882780.55069: waiting for pending results... 22225 1726882780.55564: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22225 1726882780.56029: in run() - task 0affc7ec-ae25-ec05-55b7-00000000007e 22225 1726882780.56054: variable 'ansible_search_path' from source: unknown 22225 1726882780.56061: variable 'ansible_search_path' from source: unknown 22225 1726882780.56111: calling self._execute() 22225 1726882780.56528: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882780.56532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882780.56536: variable 'omit' from source: magic vars 22225 1726882780.57377: variable 'ansible_distribution_major_version' from source: facts 22225 1726882780.57384: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882780.57598: variable 'omit' from source: magic vars 22225 1726882780.57602: variable 'omit' from source: magic vars 22225 1726882780.58004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882780.63767: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882780.64130: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882780.64134: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882780.64142: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882780.64175: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882780.64274: variable 'network_provider' from source: set_fact 22225 1726882780.64690: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882780.64731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882780.65027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882780.65031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882780.65034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882780.65127: variable 'omit' from source: magic vars 22225 1726882780.65471: variable 'omit' from source: magic vars 22225 1726882780.65596: variable 'network_connections' from source: task vars 22225 1726882780.66028: variable 'interface' from source: play vars 22225 1726882780.66031: variable 'interface' from source: play vars 22225 1726882780.66283: variable 'omit' from source: magic vars 22225 1726882780.66301: variable '__lsr_ansible_managed' from source: task vars 22225 1726882780.66371: variable '__lsr_ansible_managed' from source: task vars 22225 1726882780.67127: Loaded config def from plugin (lookup/template) 22225 1726882780.67132: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22225 1726882780.67164: File lookup term: get_ansible_managed.j2 22225 1726882780.67169: variable 'ansible_search_path' from source: unknown 22225 1726882780.67172: evaluation_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22225 1726882780.67189: search_path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22225 1726882780.67208: variable 'ansible_search_path' from source: unknown 22225 1726882780.77985: variable 'ansible_managed' from source: unknown 22225 1726882780.78304: variable 'omit' from source: magic vars 22225 1726882780.78339: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882780.78367: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882780.78388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882780.78407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882780.78417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882780.78455: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882780.78459: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882780.78462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882780.78663: Set connection var ansible_connection to ssh 22225 1726882780.78673: Set connection var ansible_pipelining to False 22225 1726882780.78683: Set connection var ansible_shell_executable to /bin/sh 22225 1726882780.78690: Set connection var ansible_timeout to 10 22225 1726882780.78693: Set connection var ansible_shell_type to sh 22225 1726882780.78699: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882780.78727: variable 'ansible_shell_executable' from source: unknown 22225 1726882780.78770: variable 'ansible_connection' from source: unknown 22225 1726882780.78774: variable 'ansible_module_compression' from source: unknown 22225 1726882780.78776: variable 'ansible_shell_type' from source: unknown 22225 1726882780.78779: variable 'ansible_shell_executable' from source: unknown 22225 1726882780.78785: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882780.78788: variable 'ansible_pipelining' from source: unknown 22225 1726882780.78793: variable 'ansible_timeout' from source: unknown 22225 1726882780.78797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882780.79233: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882780.79244: variable 'omit' from source: magic vars 22225 1726882780.79246: starting attempt loop 22225 1726882780.79249: running the handler 22225 1726882780.79251: _low_level_execute_command(): starting 22225 1726882780.79253: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882780.80330: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882780.80346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882780.80396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882780.80414: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882780.80431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882780.80504: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882780.80530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882780.80546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882780.80568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882780.80659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882780.82410: stdout chunk (state=3): >>>/root <<< 22225 1726882780.82885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882780.82889: stdout chunk (state=3): >>><<< 22225 1726882780.82893: stderr chunk (state=3): >>><<< 22225 1726882780.82896: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882780.82899: _low_level_execute_command(): starting 22225 1726882780.82902: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866 `" && echo ansible-tmp-1726882780.8278549-23506-80887539287866="` echo /root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866 `" ) && sleep 0' 22225 1726882780.84061: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882780.84070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882780.84085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882780.84100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882780.84114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882780.84121: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882780.84134: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882780.84149: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882780.84157: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882780.84164: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882780.84172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882780.84184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882780.84202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882780.84209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882780.84299: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882780.84560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882780.84619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882780.86609: stdout chunk (state=3): >>>ansible-tmp-1726882780.8278549-23506-80887539287866=/root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866 <<< 22225 1726882780.86945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882780.86950: stdout chunk (state=3): >>><<< 22225 1726882780.87035: stderr chunk (state=3): >>><<< 22225 1726882780.87039: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882780.8278549-23506-80887539287866=/root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882780.87041: variable 'ansible_module_compression' from source: unknown 22225 1726882780.87091: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 22225 1726882780.87296: variable 'ansible_facts' from source: unknown 22225 1726882780.87410: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/AnsiballZ_network_connections.py 22225 1726882780.87968: Sending initial data 22225 1726882780.87971: Sent initial data (167 bytes) 22225 1726882780.89243: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882780.89251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882780.89265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882780.89349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882780.90967: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882780.91025: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882780.91080: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmp7cwbh_ek /root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/AnsiballZ_network_connections.py <<< 22225 1726882780.91090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/AnsiballZ_network_connections.py" <<< 22225 1726882780.91135: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmp7cwbh_ek" to remote "/root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/AnsiballZ_network_connections.py" <<< 22225 1726882780.93102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882780.93240: stderr chunk (state=3): >>><<< 22225 1726882780.93244: stdout chunk (state=3): >>><<< 22225 1726882780.93275: done transferring module to remote 22225 1726882780.93287: _low_level_execute_command(): starting 22225 1726882780.93293: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/ /root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/AnsiballZ_network_connections.py && sleep 0' 22225 1726882780.94513: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882780.94517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882780.94639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882780.94645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882780.94659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882780.94665: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882780.94670: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882780.94676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882780.94693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882780.94699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882780.95035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882781.05630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882781.05717: stderr chunk (state=3): >>><<< 22225 1726882781.05849: stdout chunk (state=3): >>><<< 22225 1726882781.05939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882781.05943: _low_level_execute_command(): starting 22225 1726882781.05946: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/AnsiballZ_network_connections.py && sleep 0' 22225 1726882781.07240: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882781.07325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882781.07447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882781.07538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882781.45852: stdout chunk (state=3): >>>Traceback (most recent call last):<<< 22225 1726882781.45857: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu858dft/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu858dft/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted 
on veth0/b7f0538a-9cdb-4097-80b7-66d6eec65d0d: error=unknown <<< 22225 1726882781.46050: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 22225 1726882781.48145: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882781.48215: stderr chunk (state=3): >>><<< 22225 1726882781.48232: stdout chunk (state=3): >>><<< 22225 1726882781.48329: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu858dft/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_fu858dft/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/b7f0538a-9cdb-4097-80b7-66d6eec65d0d: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882781.48336: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882781.48339: _low_level_execute_command(): starting 22225 1726882781.48342: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882780.8278549-23506-80887539287866/ > /dev/null 2>&1 && sleep 0' 22225 1726882781.48964: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882781.48983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882781.48999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882781.49020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882781.49043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882781.49056: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882781.49077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882781.49102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882781.49116: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882781.49130: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882781.49143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882781.49157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882781.49241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882781.49256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882781.49273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882781.49298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882781.49387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882781.51410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882781.51419: stdout chunk (state=3): >>><<< 22225 1726882781.51424: stderr chunk 
(state=3): >>><<< 22225 1726882781.51468: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882781.51474: handler run complete 22225 1726882781.51509: attempt loop complete, returning result 22225 1726882781.51512: _execute() done 22225 1726882781.51515: dumping result to json 22225 1726882781.51519: done dumping result, returning 22225 1726882781.51588: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affc7ec-ae25-ec05-55b7-00000000007e] 22225 1726882781.51591: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007e 22225 1726882781.51755: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007e 22225 1726882781.51759: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 22225 1726882781.51960: no more pending results, returning what we have 22225 1726882781.51964: results queue empty 22225 1726882781.51965: checking for any_errors_fatal 22225 1726882781.51972: done checking for any_errors_fatal 22225 1726882781.51973: checking for max_fail_percentage 22225 1726882781.51975: done checking for max_fail_percentage 22225 1726882781.51976: checking to see if all hosts have failed and the running result is not ok 22225 1726882781.51977: done checking to see if all hosts have failed 22225 1726882781.51978: getting the remaining hosts for this loop 22225 1726882781.51980: done getting the remaining hosts for this loop 22225 1726882781.51984: getting the next task for host managed_node1 22225 1726882781.51992: done getting next task for host managed_node1 22225 1726882781.51999: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22225 1726882781.52002: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882781.52014: getting variables 22225 1726882781.52016: in VariableManager get_vars() 22225 1726882781.52516: Calling all_inventory to load vars for managed_node1 22225 1726882781.52519: Calling groups_inventory to load vars for managed_node1 22225 1726882781.52523: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882781.52537: Calling all_plugins_play to load vars for managed_node1 22225 1726882781.52540: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882781.52544: Calling groups_plugins_play to load vars for managed_node1 22225 1726882781.55882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882781.58776: done with get_vars() 22225 1726882781.58843: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:39:41 -0400 (0:00:01.047) 0:00:36.982 ****** 22225 1726882781.58979: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 22225 1726882781.59443: worker is 1 (out of 1 available) 22225 1726882781.59465: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 22225 1726882781.59486: done queuing things up, now waiting for results queue to drain 22225 1726882781.59487: waiting for pending results... 
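The "Configure networking connection profiles" result above reports changed=true with module_args of provider "nm" and a single connection (veth0, persistent_state "absent", state "down"). As an illustrative sketch only, assuming the usual fedora.linux_system_roles.network role conventions (the network_connections variable name is an assumption drawn from the role's documented interface, not quoted from this log), role input of the following shape would produce that invocation:

    # Sketch, not taken from this run: role variables matching the logged module_args
    - hosts: managed_node1
      vars:
        network_connections:
          - name: veth0
            persistent_state: absent   # remove the connection profile
            state: down                # and deactivate the device
      roles:
        - fedora.linux_system_roles.network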
22225 1726882781.60019: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 22225 1726882781.60028: in run() - task 0affc7ec-ae25-ec05-55b7-00000000007f 22225 1726882781.60047: variable 'ansible_search_path' from source: unknown 22225 1726882781.60056: variable 'ansible_search_path' from source: unknown 22225 1726882781.60103: calling self._execute() 22225 1726882781.60248: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.60262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.60301: variable 'omit' from source: magic vars 22225 1726882781.60902: variable 'ansible_distribution_major_version' from source: facts 22225 1726882781.60906: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882781.61020: variable 'network_state' from source: role '' defaults 22225 1726882781.61051: Evaluated conditional (network_state != {}): False 22225 1726882781.61058: when evaluation is False, skipping this task 22225 1726882781.61064: _execute() done 22225 1726882781.61073: dumping result to json 22225 1726882781.61083: done dumping result, returning 22225 1726882781.61098: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affc7ec-ae25-ec05-55b7-00000000007f] 22225 1726882781.61110: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22225 1726882781.61326: no more pending results, returning what we have 22225 1726882781.61330: results queue empty 22225 1726882781.61331: checking for any_errors_fatal 22225 1726882781.61344: done checking for any_errors_fatal 22225 1726882781.61345: checking for max_fail_percentage 22225 1726882781.61348: done checking for max_fail_percentage 22225 1726882781.61349: checking to see if all hosts have failed and the running result is not ok 22225 1726882781.61349: done checking to see if all hosts have failed 22225 1726882781.61350: getting the remaining hosts for this loop 22225 1726882781.61352: done getting the remaining hosts for this loop 22225 1726882781.61356: getting the next task for host managed_node1 22225 1726882781.61363: done getting next task for host managed_node1 22225 1726882781.61366: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22225 1726882781.61370: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882781.61391: getting variables 22225 1726882781.61392: in VariableManager get_vars() 22225 1726882781.61639: Calling all_inventory to load vars for managed_node1 22225 1726882781.61642: Calling groups_inventory to load vars for managed_node1 22225 1726882781.61645: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882781.61651: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000007f 22225 1726882781.61654: WORKER PROCESS EXITING 22225 1726882781.61663: Calling all_plugins_play to load vars for managed_node1 22225 1726882781.61667: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882781.61670: Calling groups_plugins_play to load vars for managed_node1 22225 1726882781.63566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882781.66406: done with get_vars() 22225 1726882781.66469: done getting variables 22225 1726882781.66562: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:39:41 -0400 (0:00:00.076) 0:00:37.058 ****** 22225 1726882781.66602: entering _queue_task() for managed_node1/debug 22225 1726882781.67103: worker is 1 (out of 1 available) 22225 1726882781.67120: exiting _queue_task() for managed_node1/debug 22225 1726882781.67135: done queuing things up, now waiting for results queue to drain 22225 1726882781.67137: waiting for pending results... 
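The "Configure networking state" task above was skipped: network_state comes from the role defaults as {} and the conditional network_state != {} evaluated to False. A minimal sketch of that gating pattern, using a hypothetical debug task as a stand-in for the role's real module call, looks like:

    # Sketch of the when-gate seen above; the debug task is a hypothetical stand-in
    - name: Configure networking state
      ansible.builtin.debug:
        msg: "network_state would be applied here"
      when: network_state != {}   # False when network_state is left at its {} default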
22225 1726882781.67514: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22225 1726882781.67775: in run() - task 0affc7ec-ae25-ec05-55b7-000000000080 22225 1726882781.67782: variable 'ansible_search_path' from source: unknown 22225 1726882781.67785: variable 'ansible_search_path' from source: unknown 22225 1726882781.67797: calling self._execute() 22225 1726882781.67908: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.67920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.68007: variable 'omit' from source: magic vars 22225 1726882781.68411: variable 'ansible_distribution_major_version' from source: facts 22225 1726882781.68431: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882781.68443: variable 'omit' from source: magic vars 22225 1726882781.68511: variable 'omit' from source: magic vars 22225 1726882781.68554: variable 'omit' from source: magic vars 22225 1726882781.68599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882781.68648: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882781.68675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882781.68703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882781.68721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882781.68764: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882781.68773: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.68784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.68902: Set connection var ansible_connection to ssh 22225 1726882781.68918: Set connection var ansible_pipelining to False 22225 1726882781.68935: Set connection var ansible_shell_executable to /bin/sh 22225 1726882781.68946: Set connection var ansible_timeout to 10 22225 1726882781.68958: Set connection var ansible_shell_type to sh 22225 1726882781.68969: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882781.69003: variable 'ansible_shell_executable' from source: unknown 22225 1726882781.69013: variable 'ansible_connection' from source: unknown 22225 1726882781.69024: variable 'ansible_module_compression' from source: unknown 22225 1726882781.69033: variable 'ansible_shell_type' from source: unknown 22225 1726882781.69040: variable 'ansible_shell_executable' from source: unknown 22225 1726882781.69047: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.69056: variable 'ansible_pipelining' from source: unknown 22225 1726882781.69068: variable 'ansible_timeout' from source: unknown 22225 1726882781.69174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.69247: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 
1726882781.69267: variable 'omit' from source: magic vars 22225 1726882781.69286: starting attempt loop 22225 1726882781.69294: running the handler 22225 1726882781.69439: variable '__network_connections_result' from source: set_fact 22225 1726882781.69501: handler run complete 22225 1726882781.69530: attempt loop complete, returning result 22225 1726882781.69538: _execute() done 22225 1726882781.69546: dumping result to json 22225 1726882781.69554: done dumping result, returning 22225 1726882781.69567: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affc7ec-ae25-ec05-55b7-000000000080] 22225 1726882781.69578: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000080 ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 22225 1726882781.69789: no more pending results, returning what we have 22225 1726882781.69793: results queue empty 22225 1726882781.69795: checking for any_errors_fatal 22225 1726882781.69802: done checking for any_errors_fatal 22225 1726882781.69803: checking for max_fail_percentage 22225 1726882781.69806: done checking for max_fail_percentage 22225 1726882781.69807: checking to see if all hosts have failed and the running result is not ok 22225 1726882781.69808: done checking to see if all hosts have failed 22225 1726882781.69809: getting the remaining hosts for this loop 22225 1726882781.69811: done getting the remaining hosts for this loop 22225 1726882781.69815: getting the next task for host managed_node1 22225 1726882781.69826: done getting next task for host managed_node1 22225 1726882781.69831: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22225 1726882781.69834: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882781.69846: getting variables 22225 1726882781.69849: in VariableManager get_vars() 22225 1726882781.69897: Calling all_inventory to load vars for managed_node1 22225 1726882781.69901: Calling groups_inventory to load vars for managed_node1 22225 1726882781.69903: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882781.69916: Calling all_plugins_play to load vars for managed_node1 22225 1726882781.69919: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882781.70131: Calling groups_plugins_play to load vars for managed_node1 22225 1726882781.70143: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000080 22225 1726882781.70147: WORKER PROCESS EXITING 22225 1726882781.71908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882781.73100: done with get_vars() 22225 1726882781.73120: done getting variables 22225 1726882781.73170: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:39:41 -0400 (0:00:00.065) 0:00:37.124 ****** 22225 1726882781.73197: entering _queue_task() for managed_node1/debug 22225 1726882781.73460: worker is 1 (out of 1 available) 22225 1726882781.73474: exiting _queue_task() for managed_node1/debug 22225 1726882781.73486: done queuing things up, now waiting for results queue to drain 22225 1726882781.73487: waiting for pending results... 
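The "Show stderr messages for the network_connections" task above printed __network_connections_result.stderr_lines (a single empty line for this run). A hedged sketch of a debug task that produces that output, assuming the role simply debugs the registered result, is:

    # Sketch only; the role's actual task body may differ
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines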
22225 1726882781.73737: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22225 1726882781.73986: in run() - task 0affc7ec-ae25-ec05-55b7-000000000081 22225 1726882781.73990: variable 'ansible_search_path' from source: unknown 22225 1726882781.73993: variable 'ansible_search_path' from source: unknown 22225 1726882781.73996: calling self._execute() 22225 1726882781.74010: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.74016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.74228: variable 'omit' from source: magic vars 22225 1726882781.74460: variable 'ansible_distribution_major_version' from source: facts 22225 1726882781.74483: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882781.74486: variable 'omit' from source: magic vars 22225 1726882781.74552: variable 'omit' from source: magic vars 22225 1726882781.74602: variable 'omit' from source: magic vars 22225 1726882781.74648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882781.74695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882781.74718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882781.74737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882781.74749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882781.74783: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882781.74787: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.74790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.74901: Set connection var ansible_connection to ssh 22225 1726882781.74917: Set connection var ansible_pipelining to False 22225 1726882781.74926: Set connection var ansible_shell_executable to /bin/sh 22225 1726882781.74932: Set connection var ansible_timeout to 10 22225 1726882781.74935: Set connection var ansible_shell_type to sh 22225 1726882781.74965: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882781.74980: variable 'ansible_shell_executable' from source: unknown 22225 1726882781.74990: variable 'ansible_connection' from source: unknown 22225 1726882781.74993: variable 'ansible_module_compression' from source: unknown 22225 1726882781.74995: variable 'ansible_shell_type' from source: unknown 22225 1726882781.74998: variable 'ansible_shell_executable' from source: unknown 22225 1726882781.75000: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.75002: variable 'ansible_pipelining' from source: unknown 22225 1726882781.75004: variable 'ansible_timeout' from source: unknown 22225 1726882781.75018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.75123: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 
1726882781.75138: variable 'omit' from source: magic vars 22225 1726882781.75146: starting attempt loop 22225 1726882781.75150: running the handler 22225 1726882781.75191: variable '__network_connections_result' from source: set_fact 22225 1726882781.75252: variable '__network_connections_result' from source: set_fact 22225 1726882781.75335: handler run complete 22225 1726882781.75357: attempt loop complete, returning result 22225 1726882781.75361: _execute() done 22225 1726882781.75363: dumping result to json 22225 1726882781.75366: done dumping result, returning 22225 1726882781.75374: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affc7ec-ae25-ec05-55b7-000000000081] 22225 1726882781.75380: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000081 22225 1726882781.75479: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000081 22225 1726882781.75482: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 22225 1726882781.75577: no more pending results, returning what we have 22225 1726882781.75580: results queue empty 22225 1726882781.75582: checking for any_errors_fatal 22225 1726882781.75588: done checking for any_errors_fatal 22225 1726882781.75589: checking for max_fail_percentage 22225 1726882781.75590: done checking for max_fail_percentage 22225 1726882781.75591: checking to see if all hosts have failed and the running result is not ok 22225 1726882781.75596: done checking to see if all hosts have failed 22225 1726882781.75597: getting the remaining hosts for this loop 22225 1726882781.75599: done getting the remaining hosts for this loop 22225 1726882781.75603: getting the next task for host managed_node1 22225 1726882781.75609: done getting next task for host managed_node1 22225 1726882781.75612: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22225 1726882781.75615: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882781.75626: getting variables 22225 1726882781.75627: in VariableManager get_vars() 22225 1726882781.75667: Calling all_inventory to load vars for managed_node1 22225 1726882781.75670: Calling groups_inventory to load vars for managed_node1 22225 1726882781.75672: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882781.75681: Calling all_plugins_play to load vars for managed_node1 22225 1726882781.75684: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882781.75687: Calling groups_plugins_play to load vars for managed_node1 22225 1726882781.76783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882781.78727: done with get_vars() 22225 1726882781.78755: done getting variables 22225 1726882781.78816: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:39:41 -0400 (0:00:00.056) 0:00:37.181 ****** 22225 1726882781.78859: entering _queue_task() for managed_node1/debug 22225 1726882781.79210: worker is 1 (out of 1 available) 22225 1726882781.79228: exiting _queue_task() for managed_node1/debug 22225 1726882781.79242: done queuing things up, now waiting for results queue to drain 22225 1726882781.79244: waiting for pending results... 22225 1726882781.79647: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22225 1726882781.79711: in run() - task 0affc7ec-ae25-ec05-55b7-000000000082 22225 1726882781.79740: variable 'ansible_search_path' from source: unknown 22225 1726882781.79750: variable 'ansible_search_path' from source: unknown 22225 1726882781.79793: calling self._execute() 22225 1726882781.79907: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.79920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.79938: variable 'omit' from source: magic vars 22225 1726882781.80370: variable 'ansible_distribution_major_version' from source: facts 22225 1726882781.80393: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882781.80532: variable 'network_state' from source: role '' defaults 22225 1726882781.80548: Evaluated conditional (network_state != {}): False 22225 1726882781.80556: when evaluation is False, skipping this task 22225 1726882781.80563: _execute() done 22225 1726882781.80570: dumping result to json 22225 1726882781.80609: done dumping result, returning 22225 1726882781.80613: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affc7ec-ae25-ec05-55b7-000000000082] 22225 1726882781.80615: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000082 skipping: [managed_node1] => { "false_condition": "network_state != {}" } 22225 1726882781.80774: no more pending results, returning what we have 22225 1726882781.80778: results queue empty 22225 1726882781.80779: checking for any_errors_fatal 22225 1726882781.80792: done checking 
for any_errors_fatal 22225 1726882781.80793: checking for max_fail_percentage 22225 1726882781.80795: done checking for max_fail_percentage 22225 1726882781.80796: checking to see if all hosts have failed and the running result is not ok 22225 1726882781.80797: done checking to see if all hosts have failed 22225 1726882781.80797: getting the remaining hosts for this loop 22225 1726882781.80799: done getting the remaining hosts for this loop 22225 1726882781.80804: getting the next task for host managed_node1 22225 1726882781.80812: done getting next task for host managed_node1 22225 1726882781.80815: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22225 1726882781.80819: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882781.80844: getting variables 22225 1726882781.80846: in VariableManager get_vars() 22225 1726882781.80889: Calling all_inventory to load vars for managed_node1 22225 1726882781.80892: Calling groups_inventory to load vars for managed_node1 22225 1726882781.80895: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882781.80909: Calling all_plugins_play to load vars for managed_node1 22225 1726882781.80912: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882781.80915: Calling groups_plugins_play to load vars for managed_node1 22225 1726882781.81939: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000082 22225 1726882781.81943: WORKER PROCESS EXITING 22225 1726882781.82887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882781.85108: done with get_vars() 22225 1726882781.85136: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:39:41 -0400 (0:00:00.063) 0:00:37.245 ****** 22225 1726882781.85241: entering _queue_task() for managed_node1/ping 22225 1726882781.85588: worker is 1 (out of 1 available) 22225 1726882781.85603: exiting _queue_task() for managed_node1/ping 22225 1726882781.85616: done queuing things up, now waiting for results queue to drain 22225 1726882781.85617: waiting for pending results... 
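The "Re-test connectivity" task queued above runs the ping module against managed_node1; its {"ping": "pong"} result appears further down in the trace. Reduced to a minimal sketch, the task amounts to:

    # Minimal sketch of the connectivity re-test
    - name: Re-test connectivity
      ansible.builtin.ping: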
22225 1726882781.85926: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 22225 1726882781.86076: in run() - task 0affc7ec-ae25-ec05-55b7-000000000083 22225 1726882781.86101: variable 'ansible_search_path' from source: unknown 22225 1726882781.86109: variable 'ansible_search_path' from source: unknown 22225 1726882781.86159: calling self._execute() 22225 1726882781.86274: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.86287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.86302: variable 'omit' from source: magic vars 22225 1726882781.86725: variable 'ansible_distribution_major_version' from source: facts 22225 1726882781.86745: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882781.86758: variable 'omit' from source: magic vars 22225 1726882781.86836: variable 'omit' from source: magic vars 22225 1726882781.86878: variable 'omit' from source: magic vars 22225 1726882781.86931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882781.86973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882781.87000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882781.87024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882781.87046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882781.87083: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882781.87092: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.87101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.87210: Set connection var ansible_connection to ssh 22225 1726882781.87230: Set connection var ansible_pipelining to False 22225 1726882781.87248: Set connection var ansible_shell_executable to /bin/sh 22225 1726882781.87260: Set connection var ansible_timeout to 10 22225 1726882781.87356: Set connection var ansible_shell_type to sh 22225 1726882781.87360: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882781.87362: variable 'ansible_shell_executable' from source: unknown 22225 1726882781.87364: variable 'ansible_connection' from source: unknown 22225 1726882781.87367: variable 'ansible_module_compression' from source: unknown 22225 1726882781.87369: variable 'ansible_shell_type' from source: unknown 22225 1726882781.87371: variable 'ansible_shell_executable' from source: unknown 22225 1726882781.87374: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882781.87376: variable 'ansible_pipelining' from source: unknown 22225 1726882781.87378: variable 'ansible_timeout' from source: unknown 22225 1726882781.87380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882781.87577: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22225 1726882781.87597: variable 'omit' from source: magic vars 22225 
1726882781.87608: starting attempt loop 22225 1726882781.87615: running the handler 22225 1726882781.87637: _low_level_execute_command(): starting 22225 1726882781.87648: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882781.88728: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882781.88745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882781.88939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882781.89146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882781.89312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882781.90983: stdout chunk (state=3): >>>/root <<< 22225 1726882781.91089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882781.91293: stderr chunk (state=3): >>><<< 22225 1726882781.91305: stdout chunk (state=3): >>><<< 22225 1726882781.91341: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882781.91397: _low_level_execute_command(): starting 22225 1726882781.91573: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954 `" && echo ansible-tmp-1726882781.9138124-23547-65549643236954="` echo 
/root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954 `" ) && sleep 0' 22225 1726882781.92903: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882781.92907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882781.92910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882781.92918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882781.92943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882781.93094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882781.93098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882781.93100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882781.93351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882781.95252: stdout chunk (state=3): >>>ansible-tmp-1726882781.9138124-23547-65549643236954=/root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954 <<< 22225 1726882781.95354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882781.95435: stderr chunk (state=3): >>><<< 22225 1726882781.95438: stdout chunk (state=3): >>><<< 22225 1726882781.95460: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882781.9138124-23547-65549643236954=/root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882781.95631: variable 'ansible_module_compression' from source: unknown 22225 
1726882781.95635: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 22225 1726882781.95638: variable 'ansible_facts' from source: unknown 22225 1726882781.95691: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/AnsiballZ_ping.py 22225 1726882781.95927: Sending initial data 22225 1726882781.95931: Sent initial data (152 bytes) 22225 1726882781.96472: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882781.96513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22225 1726882781.96605: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882781.96624: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882781.96647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882781.96739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882781.98640: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22225 1726882781.98661: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882781.98708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882781.98758: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpfqrmtkjz /root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/AnsiballZ_ping.py <<< 22225 1726882781.98761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/AnsiballZ_ping.py" <<< 22225 1726882781.98808: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpfqrmtkjz" to remote "/root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/AnsiballZ_ping.py" <<< 22225 1726882781.98811: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/AnsiballZ_ping.py" <<< 22225 1726882781.99409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882781.99469: stderr chunk (state=3): >>><<< 22225 1726882781.99475: stdout chunk (state=3): >>><<< 22225 1726882781.99491: done transferring module to remote 22225 1726882781.99504: _low_level_execute_command(): starting 22225 1726882781.99507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/ /root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/AnsiballZ_ping.py && sleep 0' 22225 1726882782.00131: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882782.00150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882782.00241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.00266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882782.00289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882782.00314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882782.00392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882782.02396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882782.02442: stderr chunk (state=3): >>><<< 22225 1726882782.02446: stdout chunk (state=3): >>><<< 22225 1726882782.02460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882782.02463: _low_level_execute_command(): starting 22225 1726882782.02468: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/AnsiballZ_ping.py && sleep 0' 22225 1726882782.02907: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882782.02911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.02914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882782.02916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.02963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882782.02966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882782.03033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882782.19390: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22225 1726882782.20733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882782.20792: stderr chunk (state=3): >>><<< 22225 1726882782.20795: stdout chunk (state=3): >>><<< 22225 1726882782.20810: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882782.20835: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882782.20845: _low_level_execute_command(): starting 22225 1726882782.20850: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882781.9138124-23547-65549643236954/ > /dev/null 2>&1 && sleep 0' 22225 1726882782.21317: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882782.21321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882782.21354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882782.21357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882782.21360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882782.21362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 
10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.21419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882782.21425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882782.21428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882782.21488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882782.23388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882782.23438: stderr chunk (state=3): >>><<< 22225 1726882782.23442: stdout chunk (state=3): >>><<< 22225 1726882782.23460: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882782.23469: handler run complete 22225 1726882782.23482: attempt loop complete, returning result 22225 1726882782.23486: _execute() done 22225 1726882782.23488: dumping result to json 22225 1726882782.23490: done dumping result, returning 22225 1726882782.23497: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affc7ec-ae25-ec05-55b7-000000000083] 22225 1726882782.23502: sending task result for task 0affc7ec-ae25-ec05-55b7-000000000083 22225 1726882782.23601: done sending task result for task 0affc7ec-ae25-ec05-55b7-000000000083 22225 1726882782.23604: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 22225 1726882782.23673: no more pending results, returning what we have 22225 1726882782.23676: results queue empty 22225 1726882782.23677: checking for any_errors_fatal 22225 1726882782.23686: done checking for any_errors_fatal 22225 1726882782.23687: checking for max_fail_percentage 22225 1726882782.23689: done checking for max_fail_percentage 22225 1726882782.23690: checking to see if all hosts have failed and the running result is not ok 22225 1726882782.23691: done checking to see if all hosts have failed 22225 1726882782.23692: getting the remaining hosts for this loop 22225 1726882782.23694: done getting the remaining hosts for this loop 22225 1726882782.23698: getting the next task for host managed_node1 22225 1726882782.23708: done getting next task for host managed_node1 22225 1726882782.23710: ^ task is: TASK: meta 
(role_complete) 22225 1726882782.23713: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882782.23730: getting variables 22225 1726882782.23732: in VariableManager get_vars() 22225 1726882782.23773: Calling all_inventory to load vars for managed_node1 22225 1726882782.23776: Calling groups_inventory to load vars for managed_node1 22225 1726882782.23778: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.23790: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.23793: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.23795: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.24804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882782.25973: done with get_vars() 22225 1726882782.25994: done getting variables 22225 1726882782.26059: done queuing things up, now waiting for results queue to drain 22225 1726882782.26060: results queue empty 22225 1726882782.26061: checking for any_errors_fatal 22225 1726882782.26063: done checking for any_errors_fatal 22225 1726882782.26063: checking for max_fail_percentage 22225 1726882782.26064: done checking for max_fail_percentage 22225 1726882782.26064: checking to see if all hosts have failed and the running result is not ok 22225 1726882782.26065: done checking to see if all hosts have failed 22225 1726882782.26065: getting the remaining hosts for this loop 22225 1726882782.26066: done getting the remaining hosts for this loop 22225 1726882782.26068: getting the next task for host managed_node1 22225 1726882782.26070: done getting next task for host managed_node1 22225 1726882782.26072: ^ task is: TASK: Include the task 'manage_test_interface.yml' 22225 1726882782.26073: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882782.26075: getting variables 22225 1726882782.26076: in VariableManager get_vars() 22225 1726882782.26088: Calling all_inventory to load vars for managed_node1 22225 1726882782.26089: Calling groups_inventory to load vars for managed_node1 22225 1726882782.26091: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.26094: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.26096: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.26099: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.27000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882782.28159: done with get_vars() 22225 1726882782.28178: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:104 Friday 20 September 2024 21:39:42 -0400 (0:00:00.429) 0:00:37.674 ****** 22225 1726882782.28234: entering _queue_task() for managed_node1/include_tasks 22225 1726882782.28548: worker is 1 (out of 1 available) 22225 1726882782.28565: exiting _queue_task() for managed_node1/include_tasks 22225 1726882782.28578: done queuing things up, now waiting for results queue to drain 22225 1726882782.28583: waiting for pending results... 22225 1726882782.28799: running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' 22225 1726882782.28884: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000b3 22225 1726882782.28896: variable 'ansible_search_path' from source: unknown 22225 1726882782.28934: calling self._execute() 22225 1726882782.29019: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882782.29027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882782.29037: variable 'omit' from source: magic vars 22225 1726882782.29339: variable 'ansible_distribution_major_version' from source: facts 22225 1726882782.29351: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882782.29355: _execute() done 22225 1726882782.29358: dumping result to json 22225 1726882782.29362: done dumping result, returning 22225 1726882782.29371: done running TaskExecutor() for managed_node1/TASK: Include the task 'manage_test_interface.yml' [0affc7ec-ae25-ec05-55b7-0000000000b3] 22225 1726882782.29375: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000b3 22225 1726882782.29514: no more pending results, returning what we have 22225 1726882782.29520: in VariableManager get_vars() 22225 1726882782.29570: Calling all_inventory to load vars for managed_node1 22225 1726882782.29573: Calling groups_inventory to load vars for managed_node1 22225 1726882782.29575: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.29593: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.29597: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.29600: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.30598: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000b3 22225 1726882782.30602: WORKER PROCESS EXITING 22225 1726882782.30615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882782.31864: done with get_vars() 22225 
1726882782.31881: variable 'ansible_search_path' from source: unknown 22225 1726882782.31895: we have included files to process 22225 1726882782.31896: generating all_blocks data 22225 1726882782.31898: done generating all_blocks data 22225 1726882782.31901: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22225 1726882782.31902: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22225 1726882782.31904: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22225 1726882782.32180: in VariableManager get_vars() 22225 1726882782.32198: done with get_vars() 22225 1726882782.32667: done processing included file 22225 1726882782.32668: iterating over new_blocks loaded from include file 22225 1726882782.32669: in VariableManager get_vars() 22225 1726882782.32683: done with get_vars() 22225 1726882782.32684: filtering new block on tags 22225 1726882782.32706: done filtering new block on tags 22225 1726882782.32708: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node1 22225 1726882782.32712: extending task lists for all hosts with included blocks 22225 1726882782.34201: done extending task lists 22225 1726882782.34202: done processing included files 22225 1726882782.34202: results queue empty 22225 1726882782.34203: checking for any_errors_fatal 22225 1726882782.34204: done checking for any_errors_fatal 22225 1726882782.34204: checking for max_fail_percentage 22225 1726882782.34205: done checking for max_fail_percentage 22225 1726882782.34206: checking to see if all hosts have failed and the running result is not ok 22225 1726882782.34206: done checking to see if all hosts have failed 22225 1726882782.34207: getting the remaining hosts for this loop 22225 1726882782.34208: done getting the remaining hosts for this loop 22225 1726882782.34209: getting the next task for host managed_node1 22225 1726882782.34212: done getting next task for host managed_node1 22225 1726882782.34213: ^ task is: TASK: Ensure state in ["present", "absent"] 22225 1726882782.34215: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882782.34217: getting variables 22225 1726882782.34218: in VariableManager get_vars() 22225 1726882782.34229: Calling all_inventory to load vars for managed_node1 22225 1726882782.34230: Calling groups_inventory to load vars for managed_node1 22225 1726882782.34232: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.34237: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.34238: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.34240: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.35051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882782.36194: done with get_vars() 22225 1726882782.36213: done getting variables 22225 1726882782.36248: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:39:42 -0400 (0:00:00.080) 0:00:37.755 ****** 22225 1726882782.36271: entering _queue_task() for managed_node1/fail 22225 1726882782.36553: worker is 1 (out of 1 available) 22225 1726882782.36569: exiting _queue_task() for managed_node1/fail 22225 1726882782.36580: done queuing things up, now waiting for results queue to drain 22225 1726882782.36581: waiting for pending results... 22225 1726882782.36775: running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] 22225 1726882782.36866: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005cc 22225 1726882782.36878: variable 'ansible_search_path' from source: unknown 22225 1726882782.36881: variable 'ansible_search_path' from source: unknown 22225 1726882782.36921: calling self._execute() 22225 1726882782.37000: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882782.37004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882782.37014: variable 'omit' from source: magic vars 22225 1726882782.37323: variable 'ansible_distribution_major_version' from source: facts 22225 1726882782.37334: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882782.37438: variable 'state' from source: include params 22225 1726882782.37443: Evaluated conditional (state not in ["present", "absent"]): False 22225 1726882782.37446: when evaluation is False, skipping this task 22225 1726882782.37449: _execute() done 22225 1726882782.37452: dumping result to json 22225 1726882782.37454: done dumping result, returning 22225 1726882782.37464: done running TaskExecutor() for managed_node1/TASK: Ensure state in ["present", "absent"] [0affc7ec-ae25-ec05-55b7-0000000005cc] 22225 1726882782.37466: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005cc 22225 1726882782.37566: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005cc 22225 1726882782.37570: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 22225 1726882782.37618: no more pending 
results, returning what we have 22225 1726882782.37625: results queue empty 22225 1726882782.37626: checking for any_errors_fatal 22225 1726882782.37628: done checking for any_errors_fatal 22225 1726882782.37629: checking for max_fail_percentage 22225 1726882782.37631: done checking for max_fail_percentage 22225 1726882782.37632: checking to see if all hosts have failed and the running result is not ok 22225 1726882782.37633: done checking to see if all hosts have failed 22225 1726882782.37634: getting the remaining hosts for this loop 22225 1726882782.37636: done getting the remaining hosts for this loop 22225 1726882782.37641: getting the next task for host managed_node1 22225 1726882782.37648: done getting next task for host managed_node1 22225 1726882782.37651: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 22225 1726882782.37655: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882782.37659: getting variables 22225 1726882782.37660: in VariableManager get_vars() 22225 1726882782.37703: Calling all_inventory to load vars for managed_node1 22225 1726882782.37706: Calling groups_inventory to load vars for managed_node1 22225 1726882782.37708: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.37719: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.37732: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.37737: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.42069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882782.43212: done with get_vars() 22225 1726882782.43232: done getting variables 22225 1726882782.43268: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:39:42 -0400 (0:00:00.070) 0:00:37.825 ****** 22225 1726882782.43290: entering _queue_task() for managed_node1/fail 22225 1726882782.43569: worker is 1 (out of 1 available) 22225 1726882782.43584: exiting _queue_task() for managed_node1/fail 22225 1726882782.43596: done queuing things up, now waiting for results queue to drain 22225 1726882782.43597: waiting for pending results... 
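The skipped "Ensure state in [...]" task above, and the "Ensure type in [...]" task queued next, are validation guards in manage_test_interface.yml (task paths :3 and :8). Judging from the fail action plugin being loaded for both and from the false_condition fields reported in the skip results, they likely look roughly like the sketch below; the when expressions are taken from the log, while the task messages are assumptions, not the verified file contents:

    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be present or absent"      # exact message is an assumption
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be dummy, tap or veth"      # exact message is an assumption
      when: type not in ["dummy", "tap", "veth"]

Because state and type fall inside the allowed sets for this run, both conditions evaluate to False, the fail action never executes, and each task is reported as skipping with skip_reason "Conditional result was False".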
22225 1726882782.43788: running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] 22225 1726882782.43876: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005cd 22225 1726882782.43890: variable 'ansible_search_path' from source: unknown 22225 1726882782.43895: variable 'ansible_search_path' from source: unknown 22225 1726882782.43929: calling self._execute() 22225 1726882782.44011: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882782.44016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882782.44028: variable 'omit' from source: magic vars 22225 1726882782.44337: variable 'ansible_distribution_major_version' from source: facts 22225 1726882782.44348: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882782.44453: variable 'type' from source: play vars 22225 1726882782.44457: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 22225 1726882782.44459: when evaluation is False, skipping this task 22225 1726882782.44462: _execute() done 22225 1726882782.44465: dumping result to json 22225 1726882782.44469: done dumping result, returning 22225 1726882782.44475: done running TaskExecutor() for managed_node1/TASK: Ensure type in ["dummy", "tap", "veth"] [0affc7ec-ae25-ec05-55b7-0000000005cd] 22225 1726882782.44486: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005cd 22225 1726882782.44573: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005cd 22225 1726882782.44576: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 22225 1726882782.44629: no more pending results, returning what we have 22225 1726882782.44632: results queue empty 22225 1726882782.44633: checking for any_errors_fatal 22225 1726882782.44644: done checking for any_errors_fatal 22225 1726882782.44644: checking for max_fail_percentage 22225 1726882782.44646: done checking for max_fail_percentage 22225 1726882782.44647: checking to see if all hosts have failed and the running result is not ok 22225 1726882782.44648: done checking to see if all hosts have failed 22225 1726882782.44649: getting the remaining hosts for this loop 22225 1726882782.44651: done getting the remaining hosts for this loop 22225 1726882782.44654: getting the next task for host managed_node1 22225 1726882782.44661: done getting next task for host managed_node1 22225 1726882782.44664: ^ task is: TASK: Include the task 'show_interfaces.yml' 22225 1726882782.44667: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882782.44670: getting variables 22225 1726882782.44672: in VariableManager get_vars() 22225 1726882782.44712: Calling all_inventory to load vars for managed_node1 22225 1726882782.44715: Calling groups_inventory to load vars for managed_node1 22225 1726882782.44717: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.44730: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.44732: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.44735: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.45756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882782.46931: done with get_vars() 22225 1726882782.46947: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:39:42 -0400 (0:00:00.037) 0:00:37.862 ****** 22225 1726882782.47020: entering _queue_task() for managed_node1/include_tasks 22225 1726882782.47252: worker is 1 (out of 1 available) 22225 1726882782.47268: exiting _queue_task() for managed_node1/include_tasks 22225 1726882782.47279: done queuing things up, now waiting for results queue to drain 22225 1726882782.47281: waiting for pending results... 22225 1726882782.47462: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 22225 1726882782.47547: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005ce 22225 1726882782.47557: variable 'ansible_search_path' from source: unknown 22225 1726882782.47561: variable 'ansible_search_path' from source: unknown 22225 1726882782.47594: calling self._execute() 22225 1726882782.47677: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882782.47681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882782.47694: variable 'omit' from source: magic vars 22225 1726882782.48005: variable 'ansible_distribution_major_version' from source: facts 22225 1726882782.48014: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882782.48021: _execute() done 22225 1726882782.48025: dumping result to json 22225 1726882782.48028: done dumping result, returning 22225 1726882782.48035: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [0affc7ec-ae25-ec05-55b7-0000000005ce] 22225 1726882782.48042: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005ce 22225 1726882782.48140: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005ce 22225 1726882782.48143: WORKER PROCESS EXITING 22225 1726882782.48180: no more pending results, returning what we have 22225 1726882782.48186: in VariableManager get_vars() 22225 1726882782.48233: Calling all_inventory to load vars for managed_node1 22225 1726882782.48236: Calling groups_inventory to load vars for managed_node1 22225 1726882782.48238: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.48250: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.48252: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.48255: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.49209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 22225 1726882782.50389: done with get_vars() 22225 1726882782.50405: variable 'ansible_search_path' from source: unknown 22225 1726882782.50406: variable 'ansible_search_path' from source: unknown 22225 1726882782.50437: we have included files to process 22225 1726882782.50438: generating all_blocks data 22225 1726882782.50440: done generating all_blocks data 22225 1726882782.50444: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22225 1726882782.50445: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22225 1726882782.50447: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22225 1726882782.50525: in VariableManager get_vars() 22225 1726882782.50542: done with get_vars() 22225 1726882782.50627: done processing included file 22225 1726882782.50629: iterating over new_blocks loaded from include file 22225 1726882782.50630: in VariableManager get_vars() 22225 1726882782.50643: done with get_vars() 22225 1726882782.50644: filtering new block on tags 22225 1726882782.50656: done filtering new block on tags 22225 1726882782.50658: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 22225 1726882782.50661: extending task lists for all hosts with included blocks 22225 1726882782.50936: done extending task lists 22225 1726882782.50938: done processing included files 22225 1726882782.50938: results queue empty 22225 1726882782.50939: checking for any_errors_fatal 22225 1726882782.50941: done checking for any_errors_fatal 22225 1726882782.50942: checking for max_fail_percentage 22225 1726882782.50942: done checking for max_fail_percentage 22225 1726882782.50943: checking to see if all hosts have failed and the running result is not ok 22225 1726882782.50944: done checking to see if all hosts have failed 22225 1726882782.50944: getting the remaining hosts for this loop 22225 1726882782.50945: done getting the remaining hosts for this loop 22225 1726882782.50947: getting the next task for host managed_node1 22225 1726882782.50949: done getting next task for host managed_node1 22225 1726882782.50951: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22225 1726882782.50953: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882782.50955: getting variables 22225 1726882782.50955: in VariableManager get_vars() 22225 1726882782.50964: Calling all_inventory to load vars for managed_node1 22225 1726882782.50966: Calling groups_inventory to load vars for managed_node1 22225 1726882782.50967: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.50971: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.50973: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.50975: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.51891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882782.53054: done with get_vars() 22225 1726882782.53073: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:39:42 -0400 (0:00:00.061) 0:00:37.924 ****** 22225 1726882782.53138: entering _queue_task() for managed_node1/include_tasks 22225 1726882782.53414: worker is 1 (out of 1 available) 22225 1726882782.53431: exiting _queue_task() for managed_node1/include_tasks 22225 1726882782.53443: done queuing things up, now waiting for results queue to drain 22225 1726882782.53444: waiting for pending results... 22225 1726882782.53648: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 22225 1726882782.53744: in run() - task 0affc7ec-ae25-ec05-55b7-0000000006e4 22225 1726882782.53760: variable 'ansible_search_path' from source: unknown 22225 1726882782.53764: variable 'ansible_search_path' from source: unknown 22225 1726882782.53795: calling self._execute() 22225 1726882782.53876: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882782.53883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882782.53889: variable 'omit' from source: magic vars 22225 1726882782.54199: variable 'ansible_distribution_major_version' from source: facts 22225 1726882782.54209: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882782.54216: _execute() done 22225 1726882782.54220: dumping result to json 22225 1726882782.54223: done dumping result, returning 22225 1726882782.54234: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [0affc7ec-ae25-ec05-55b7-0000000006e4] 22225 1726882782.54237: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000006e4 22225 1726882782.54337: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000006e4 22225 1726882782.54339: WORKER PROCESS EXITING 22225 1726882782.54370: no more pending results, returning what we have 22225 1726882782.54376: in VariableManager get_vars() 22225 1726882782.54427: Calling all_inventory to load vars for managed_node1 22225 1726882782.54430: Calling groups_inventory to load vars for managed_node1 22225 1726882782.54432: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.54454: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.54457: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.54461: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.55531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 22225 1726882782.56716: done with get_vars() 22225 1726882782.56737: variable 'ansible_search_path' from source: unknown 22225 1726882782.56738: variable 'ansible_search_path' from source: unknown 22225 1726882782.56785: we have included files to process 22225 1726882782.56785: generating all_blocks data 22225 1726882782.56787: done generating all_blocks data 22225 1726882782.56788: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22225 1726882782.56789: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22225 1726882782.56790: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22225 1726882782.56991: done processing included file 22225 1726882782.56993: iterating over new_blocks loaded from include file 22225 1726882782.56994: in VariableManager get_vars() 22225 1726882782.57008: done with get_vars() 22225 1726882782.57009: filtering new block on tags 22225 1726882782.57024: done filtering new block on tags 22225 1726882782.57026: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 22225 1726882782.57029: extending task lists for all hosts with included blocks 22225 1726882782.57135: done extending task lists 22225 1726882782.57136: done processing included files 22225 1726882782.57136: results queue empty 22225 1726882782.57137: checking for any_errors_fatal 22225 1726882782.57139: done checking for any_errors_fatal 22225 1726882782.57140: checking for max_fail_percentage 22225 1726882782.57140: done checking for max_fail_percentage 22225 1726882782.57141: checking to see if all hosts have failed and the running result is not ok 22225 1726882782.57141: done checking to see if all hosts have failed 22225 1726882782.57142: getting the remaining hosts for this loop 22225 1726882782.57143: done getting the remaining hosts for this loop 22225 1726882782.57144: getting the next task for host managed_node1 22225 1726882782.57147: done getting next task for host managed_node1 22225 1726882782.57149: ^ task is: TASK: Gather current interface info 22225 1726882782.57152: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 22225 1726882782.57153: getting variables 22225 1726882782.57154: in VariableManager get_vars() 22225 1726882782.57163: Calling all_inventory to load vars for managed_node1 22225 1726882782.57164: Calling groups_inventory to load vars for managed_node1 22225 1726882782.57165: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.57170: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.57172: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.57175: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.58016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882782.59248: done with get_vars() 22225 1726882782.59267: done getting variables 22225 1726882782.59302: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:39:42 -0400 (0:00:00.061) 0:00:37.985 ****** 22225 1726882782.59329: entering _queue_task() for managed_node1/command 22225 1726882782.59610: worker is 1 (out of 1 available) 22225 1726882782.59625: exiting _queue_task() for managed_node1/command 22225 1726882782.59639: done queuing things up, now waiting for results queue to drain 22225 1726882782.59641: waiting for pending results... 
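The task being queued here, "Gather current interface info" (get_current_interfaces.yml:3), is an ansible.legacy.command task; the module invocation recorded further down in the trace shows it simply lists /sys/class/net. A minimal sketch of that task under those assumptions (the register variable name is a guess, not taken from the file):

    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces   # variable name is an assumption
      changed_when: false             # inferred, see note below

The changed_when: false line is an inference: the raw module JSON later in the trace returns "changed": true (the command module's default), yet the final callback output prints "changed": false, which is the usual sign that the task overrides the changed status.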
22225 1726882782.59833: running TaskExecutor() for managed_node1/TASK: Gather current interface info 22225 1726882782.59938: in run() - task 0affc7ec-ae25-ec05-55b7-00000000071b 22225 1726882782.59951: variable 'ansible_search_path' from source: unknown 22225 1726882782.59954: variable 'ansible_search_path' from source: unknown 22225 1726882782.59989: calling self._execute() 22225 1726882782.60065: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882782.60069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882782.60084: variable 'omit' from source: magic vars 22225 1726882782.60387: variable 'ansible_distribution_major_version' from source: facts 22225 1726882782.60395: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882782.60403: variable 'omit' from source: magic vars 22225 1726882782.60445: variable 'omit' from source: magic vars 22225 1726882782.60472: variable 'omit' from source: magic vars 22225 1726882782.60509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882782.60543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882782.60560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882782.60574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882782.60585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882782.60610: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882782.60613: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882782.60616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882782.60693: Set connection var ansible_connection to ssh 22225 1726882782.60702: Set connection var ansible_pipelining to False 22225 1726882782.60709: Set connection var ansible_shell_executable to /bin/sh 22225 1726882782.60714: Set connection var ansible_timeout to 10 22225 1726882782.60717: Set connection var ansible_shell_type to sh 22225 1726882782.60724: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882782.60746: variable 'ansible_shell_executable' from source: unknown 22225 1726882782.60749: variable 'ansible_connection' from source: unknown 22225 1726882782.60754: variable 'ansible_module_compression' from source: unknown 22225 1726882782.60757: variable 'ansible_shell_type' from source: unknown 22225 1726882782.60759: variable 'ansible_shell_executable' from source: unknown 22225 1726882782.60761: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882782.60764: variable 'ansible_pipelining' from source: unknown 22225 1726882782.60766: variable 'ansible_timeout' from source: unknown 22225 1726882782.60769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882782.60885: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882782.60895: variable 'omit' from source: magic vars 22225 
1726882782.60898: starting attempt loop 22225 1726882782.60900: running the handler 22225 1726882782.60915: _low_level_execute_command(): starting 22225 1726882782.60923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882782.61478: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882782.61482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.61487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882782.61490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.61533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882782.61549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882782.61611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882782.63357: stdout chunk (state=3): >>>/root <<< 22225 1726882782.63468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882782.63525: stderr chunk (state=3): >>><<< 22225 1726882782.63529: stdout chunk (state=3): >>><<< 22225 1726882782.63553: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882782.63566: _low_level_execute_command(): starting 22225 1726882782.63572: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613 `" && echo 
ansible-tmp-1726882782.6355157-23571-13595679207613="` echo /root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613 `" ) && sleep 0' 22225 1726882782.64057: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882782.64062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882782.64065: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.64074: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882782.64077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882782.64082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.64118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882782.64125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882782.64188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882782.66156: stdout chunk (state=3): >>>ansible-tmp-1726882782.6355157-23571-13595679207613=/root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613 <<< 22225 1726882782.66277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882782.66334: stderr chunk (state=3): >>><<< 22225 1726882782.66337: stdout chunk (state=3): >>><<< 22225 1726882782.66353: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882782.6355157-23571-13595679207613=/root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882782.66384: variable 
'ansible_module_compression' from source: unknown 22225 1726882782.66429: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882782.66465: variable 'ansible_facts' from source: unknown 22225 1726882782.66516: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/AnsiballZ_command.py 22225 1726882782.66627: Sending initial data 22225 1726882782.66631: Sent initial data (155 bytes) 22225 1726882782.67109: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882782.67112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.67115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882782.67118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.67162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882782.67165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882782.67228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882782.68812: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22225 1726882782.68824: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882782.68861: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882782.68912: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpi9l5gjd1 /root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/AnsiballZ_command.py <<< 22225 1726882782.68920: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/AnsiballZ_command.py" <<< 22225 1726882782.68966: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpi9l5gjd1" to remote "/root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/AnsiballZ_command.py" <<< 22225 1726882782.68970: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/AnsiballZ_command.py" <<< 22225 1726882782.69539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882782.69610: stderr chunk (state=3): >>><<< 22225 1726882782.69614: stdout chunk (state=3): >>><<< 22225 1726882782.69636: done transferring module to remote 22225 1726882782.69646: _low_level_execute_command(): starting 22225 1726882782.69652: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/ /root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/AnsiballZ_command.py && sleep 0' 22225 1726882782.70108: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882782.70149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882782.70153: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882782.70155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.70158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882782.70164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.70209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882782.70216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882782.70218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882782.70268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882782.72066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882782.72116: stderr chunk (state=3): >>><<< 22225 1726882782.72119: stdout chunk (state=3): >>><<< 22225 1726882782.72137: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882782.72140: _low_level_execute_command(): starting 22225 1726882782.72144: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/AnsiballZ_command.py && sleep 0' 22225 1726882782.72583: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882782.72587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882782.72617: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882782.72620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882782.72625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882782.72627: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.72687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882782.72690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882782.72692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882782.72748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882782.89674: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:42.891336", "end": "2024-09-20 21:39:42.894949", "delta": "0:00:00.003613", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882782.91263: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882782.91326: stderr chunk (state=3): >>><<< 22225 1726882782.91329: stdout chunk (state=3): >>><<< 22225 1726882782.91351: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:39:42.891336", "end": "2024-09-20 21:39:42.894949", "delta": "0:00:00.003613", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
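The module run above returned the interface list (bonding_masters, eth0, lo, veth0, one per stdout line). The follow-up task announced near the end of this excerpt, "Set current_interfaces", is not shown executing here; a plausible sketch of it, assuming the conventional register-then-set_fact pattern and reusing the hypothetical variable name from the previous sketch:

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed pattern

stdout_lines splits the command's stdout on newlines, so later tasks can check the resulting list for the interface under test (here the existing veth0).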
22225 1726882782.91381: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882782.91388: _low_level_execute_command(): starting 22225 1726882782.91395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882782.6355157-23571-13595679207613/ > /dev/null 2>&1 && sleep 0' 22225 1726882782.91884: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882782.91888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882782.91896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.91899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882782.91901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882782.91961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882782.91963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882782.91965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882782.92010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882782.93898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882782.93947: stderr chunk (state=3): >>><<< 22225 1726882782.93951: stdout chunk (state=3): >>><<< 22225 1726882782.93965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882782.93971: handler run complete 22225 1726882782.93996: Evaluated conditional (False): False 22225 1726882782.94005: attempt loop complete, returning result 22225 1726882782.94008: _execute() done 22225 1726882782.94015: dumping result to json 22225 1726882782.94018: done dumping result, returning 22225 1726882782.94029: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [0affc7ec-ae25-ec05-55b7-00000000071b] 22225 1726882782.94034: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000071b 22225 1726882782.94210: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000071b 22225 1726882782.94213: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003613", "end": "2024-09-20 21:39:42.894949", "rc": 0, "start": "2024-09-20 21:39:42.891336" } STDOUT: bonding_masters eth0 lo veth0 22225 1726882782.94503: no more pending results, returning what we have 22225 1726882782.94507: results queue empty 22225 1726882782.94508: checking for any_errors_fatal 22225 1726882782.94509: done checking for any_errors_fatal 22225 1726882782.94510: checking for max_fail_percentage 22225 1726882782.94512: done checking for max_fail_percentage 22225 1726882782.94513: checking to see if all hosts have failed and the running result is not ok 22225 1726882782.94514: done checking to see if all hosts have failed 22225 1726882782.94515: getting the remaining hosts for this loop 22225 1726882782.94516: done getting the remaining hosts for this loop 22225 1726882782.94521: getting the next task for host managed_node1 22225 1726882782.94530: done getting next task for host managed_node1 22225 1726882782.94534: ^ task is: TASK: Set current_interfaces 22225 1726882782.94540: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882782.94545: getting variables 22225 1726882782.94546: in VariableManager get_vars() 22225 1726882782.94588: Calling all_inventory to load vars for managed_node1 22225 1726882782.94591: Calling groups_inventory to load vars for managed_node1 22225 1726882782.94594: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882782.94605: Calling all_plugins_play to load vars for managed_node1 22225 1726882782.94607: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882782.94611: Calling groups_plugins_play to load vars for managed_node1 22225 1726882782.96295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882782.98416: done with get_vars() 22225 1726882782.98452: done getting variables 22225 1726882782.98523: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:39:42 -0400 (0:00:00.392) 0:00:38.378 ****** 22225 1726882782.98563: entering _queue_task() for managed_node1/set_fact 22225 1726882782.98875: worker is 1 (out of 1 available) 22225 1726882782.98890: exiting _queue_task() for managed_node1/set_fact 22225 1726882782.98907: done queuing things up, now waiting for results queue to drain 22225 1726882782.98909: waiting for pending results... 
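The task queued here, together with the Show current_interfaces debug task that follows it, turns the registered command output into the current_interfaces fact and prints it; the corresponding results appear further down in the log. A sketch of what these two tasks in get_current_interfaces.yml and show_interfaces.yml might look like; the stdout_lines expression is an assumption, since the log only shows that the fact is derived from the registered command output:

  - name: Set current_interfaces
    set_fact:
      current_interfaces: "{{ _current_interfaces.stdout_lines }}"

  - name: Show current_interfaces
    debug:
      msg: "current_interfaces: {{ current_interfaces }}"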
22225 1726882782.99214: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 22225 1726882782.99307: in run() - task 0affc7ec-ae25-ec05-55b7-00000000071c 22225 1726882782.99319: variable 'ansible_search_path' from source: unknown 22225 1726882782.99325: variable 'ansible_search_path' from source: unknown 22225 1726882782.99356: calling self._execute() 22225 1726882782.99443: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882782.99448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882782.99458: variable 'omit' from source: magic vars 22225 1726882782.99771: variable 'ansible_distribution_major_version' from source: facts 22225 1726882782.99784: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882782.99788: variable 'omit' from source: magic vars 22225 1726882782.99832: variable 'omit' from source: magic vars 22225 1726882782.99908: variable '_current_interfaces' from source: set_fact 22225 1726882782.99963: variable 'omit' from source: magic vars 22225 1726882783.00000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882783.00032: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882783.00047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882783.00062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882783.00072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882783.00097: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882783.00100: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882783.00105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882783.00182: Set connection var ansible_connection to ssh 22225 1726882783.00192: Set connection var ansible_pipelining to False 22225 1726882783.00200: Set connection var ansible_shell_executable to /bin/sh 22225 1726882783.00206: Set connection var ansible_timeout to 10 22225 1726882783.00208: Set connection var ansible_shell_type to sh 22225 1726882783.00214: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882783.00245: variable 'ansible_shell_executable' from source: unknown 22225 1726882783.00249: variable 'ansible_connection' from source: unknown 22225 1726882783.00252: variable 'ansible_module_compression' from source: unknown 22225 1726882783.00254: variable 'ansible_shell_type' from source: unknown 22225 1726882783.00257: variable 'ansible_shell_executable' from source: unknown 22225 1726882783.00259: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882783.00261: variable 'ansible_pipelining' from source: unknown 22225 1726882783.00264: variable 'ansible_timeout' from source: unknown 22225 1726882783.00272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882783.00372: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 22225 1726882783.00385: variable 'omit' from source: magic vars 22225 1726882783.00392: starting attempt loop 22225 1726882783.00395: running the handler 22225 1726882783.00405: handler run complete 22225 1726882783.00413: attempt loop complete, returning result 22225 1726882783.00416: _execute() done 22225 1726882783.00419: dumping result to json 22225 1726882783.00421: done dumping result, returning 22225 1726882783.00431: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [0affc7ec-ae25-ec05-55b7-00000000071c] 22225 1726882783.00438: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000071c 22225 1726882783.00529: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000071c 22225 1726882783.00532: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "veth0" ] }, "changed": false } 22225 1726882783.00609: no more pending results, returning what we have 22225 1726882783.00612: results queue empty 22225 1726882783.00613: checking for any_errors_fatal 22225 1726882783.00624: done checking for any_errors_fatal 22225 1726882783.00625: checking for max_fail_percentage 22225 1726882783.00627: done checking for max_fail_percentage 22225 1726882783.00628: checking to see if all hosts have failed and the running result is not ok 22225 1726882783.00629: done checking to see if all hosts have failed 22225 1726882783.00629: getting the remaining hosts for this loop 22225 1726882783.00631: done getting the remaining hosts for this loop 22225 1726882783.00635: getting the next task for host managed_node1 22225 1726882783.00643: done getting next task for host managed_node1 22225 1726882783.00645: ^ task is: TASK: Show current_interfaces 22225 1726882783.00650: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882783.00653: getting variables 22225 1726882783.00655: in VariableManager get_vars() 22225 1726882783.00695: Calling all_inventory to load vars for managed_node1 22225 1726882783.00697: Calling groups_inventory to load vars for managed_node1 22225 1726882783.00706: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882783.00717: Calling all_plugins_play to load vars for managed_node1 22225 1726882783.00719: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882783.00724: Calling groups_plugins_play to load vars for managed_node1 22225 1726882783.01838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882783.02998: done with get_vars() 22225 1726882783.03016: done getting variables 22225 1726882783.03059: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:39:43 -0400 (0:00:00.045) 0:00:38.423 ****** 22225 1726882783.03084: entering _queue_task() for managed_node1/debug 22225 1726882783.03327: worker is 1 (out of 1 available) 22225 1726882783.03344: exiting _queue_task() for managed_node1/debug 22225 1726882783.03355: done queuing things up, now waiting for results queue to drain 22225 1726882783.03357: waiting for pending results... 22225 1726882783.03548: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 22225 1726882783.03628: in run() - task 0affc7ec-ae25-ec05-55b7-0000000006e5 22225 1726882783.03641: variable 'ansible_search_path' from source: unknown 22225 1726882783.03645: variable 'ansible_search_path' from source: unknown 22225 1726882783.03675: calling self._execute() 22225 1726882783.03756: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882783.03761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882783.03770: variable 'omit' from source: magic vars 22225 1726882783.04080: variable 'ansible_distribution_major_version' from source: facts 22225 1726882783.04093: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882783.04099: variable 'omit' from source: magic vars 22225 1726882783.04143: variable 'omit' from source: magic vars 22225 1726882783.04214: variable 'current_interfaces' from source: set_fact 22225 1726882783.04242: variable 'omit' from source: magic vars 22225 1726882783.04274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882783.04306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882783.04323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882783.04338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882783.04349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882783.04375: 
variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882783.04378: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882783.04381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882783.04456: Set connection var ansible_connection to ssh 22225 1726882783.04464: Set connection var ansible_pipelining to False 22225 1726882783.04471: Set connection var ansible_shell_executable to /bin/sh 22225 1726882783.04477: Set connection var ansible_timeout to 10 22225 1726882783.04480: Set connection var ansible_shell_type to sh 22225 1726882783.04487: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882783.04507: variable 'ansible_shell_executable' from source: unknown 22225 1726882783.04511: variable 'ansible_connection' from source: unknown 22225 1726882783.04514: variable 'ansible_module_compression' from source: unknown 22225 1726882783.04516: variable 'ansible_shell_type' from source: unknown 22225 1726882783.04519: variable 'ansible_shell_executable' from source: unknown 22225 1726882783.04523: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882783.04525: variable 'ansible_pipelining' from source: unknown 22225 1726882783.04528: variable 'ansible_timeout' from source: unknown 22225 1726882783.04533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882783.04642: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882783.04653: variable 'omit' from source: magic vars 22225 1726882783.04658: starting attempt loop 22225 1726882783.04661: running the handler 22225 1726882783.04705: handler run complete 22225 1726882783.04716: attempt loop complete, returning result 22225 1726882783.04719: _execute() done 22225 1726882783.04723: dumping result to json 22225 1726882783.04726: done dumping result, returning 22225 1726882783.04733: done running TaskExecutor() for managed_node1/TASK: Show current_interfaces [0affc7ec-ae25-ec05-55b7-0000000006e5] 22225 1726882783.04737: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000006e5 22225 1726882783.04829: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000006e5 22225 1726882783.04832: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0'] 22225 1726882783.04881: no more pending results, returning what we have 22225 1726882783.04886: results queue empty 22225 1726882783.04887: checking for any_errors_fatal 22225 1726882783.04894: done checking for any_errors_fatal 22225 1726882783.04895: checking for max_fail_percentage 22225 1726882783.04897: done checking for max_fail_percentage 22225 1726882783.04898: checking to see if all hosts have failed and the running result is not ok 22225 1726882783.04898: done checking to see if all hosts have failed 22225 1726882783.04899: getting the remaining hosts for this loop 22225 1726882783.04901: done getting the remaining hosts for this loop 22225 1726882783.04905: getting the next task for host managed_node1 22225 1726882783.04913: done getting next task for host managed_node1 22225 1726882783.04916: ^ task is: TASK: Install iproute 22225 1726882783.04919: ^ state is: HOST STATE: block=3, task=15, 
rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 22225 1726882783.04924: getting variables 22225 1726882783.04926: in VariableManager get_vars() 22225 1726882783.04964: Calling all_inventory to load vars for managed_node1 22225 1726882783.04966: Calling groups_inventory to load vars for managed_node1 22225 1726882783.04968: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882783.04979: Calling all_plugins_play to load vars for managed_node1 22225 1726882783.04981: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882783.04984: Calling groups_plugins_play to load vars for managed_node1 22225 1726882783.05968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882783.07243: done with get_vars() 22225 1726882783.07261: done getting variables 22225 1726882783.07306: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:39:43 -0400 (0:00:00.042) 0:00:38.465 ****** 22225 1726882783.07333: entering _queue_task() for managed_node1/package 22225 1726882783.07585: worker is 1 (out of 1 available) 22225 1726882783.07598: exiting _queue_task() for managed_node1/package 22225 1726882783.07611: done queuing things up, now waiting for results queue to drain 22225 1726882783.07613: waiting for pending results... 
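The Install iproute task queued here uses the generic package action, which resolves to ansible.legacy.dnf on this host; the module_args in the result below show name: iproute and state: present, and the later conditional check (__install_status is success) together with the attempts: 1 field suggests the task registers its result and retries with until. A sketch under those assumptions (the retry count and delay are not visible in the log):

  - name: Install iproute
    package:
      name: iproute
      state: present
    register: __install_status
    until: __install_status is success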
22225 1726882783.07809: running TaskExecutor() for managed_node1/TASK: Install iproute 22225 1726882783.07885: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005cf 22225 1726882783.07899: variable 'ansible_search_path' from source: unknown 22225 1726882783.07904: variable 'ansible_search_path' from source: unknown 22225 1726882783.07934: calling self._execute() 22225 1726882783.08016: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882783.08020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882783.08031: variable 'omit' from source: magic vars 22225 1726882783.08339: variable 'ansible_distribution_major_version' from source: facts 22225 1726882783.08348: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882783.08354: variable 'omit' from source: magic vars 22225 1726882783.08388: variable 'omit' from source: magic vars 22225 1726882783.08539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22225 1726882783.10146: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22225 1726882783.10198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22225 1726882783.10228: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22225 1726882783.10267: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22225 1726882783.10291: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22225 1726882783.10371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22225 1726882783.10394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22225 1726882783.10412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22225 1726882783.10443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22225 1726882783.10455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22225 1726882783.10540: variable '__network_is_ostree' from source: set_fact 22225 1726882783.10544: variable 'omit' from source: magic vars 22225 1726882783.10570: variable 'omit' from source: magic vars 22225 1726882783.10597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882783.10618: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882783.10634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882783.10648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 22225 1726882783.10656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882783.10688: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882783.10691: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882783.10693: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882783.10763: Set connection var ansible_connection to ssh 22225 1726882783.10771: Set connection var ansible_pipelining to False 22225 1726882783.10780: Set connection var ansible_shell_executable to /bin/sh 22225 1726882783.10790: Set connection var ansible_timeout to 10 22225 1726882783.10792: Set connection var ansible_shell_type to sh 22225 1726882783.10800: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882783.10818: variable 'ansible_shell_executable' from source: unknown 22225 1726882783.10821: variable 'ansible_connection' from source: unknown 22225 1726882783.10826: variable 'ansible_module_compression' from source: unknown 22225 1726882783.10828: variable 'ansible_shell_type' from source: unknown 22225 1726882783.10830: variable 'ansible_shell_executable' from source: unknown 22225 1726882783.10833: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882783.10838: variable 'ansible_pipelining' from source: unknown 22225 1726882783.10841: variable 'ansible_timeout' from source: unknown 22225 1726882783.10845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882783.10927: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882783.10936: variable 'omit' from source: magic vars 22225 1726882783.10942: starting attempt loop 22225 1726882783.10945: running the handler 22225 1726882783.10951: variable 'ansible_facts' from source: unknown 22225 1726882783.10954: variable 'ansible_facts' from source: unknown 22225 1726882783.10984: _low_level_execute_command(): starting 22225 1726882783.10992: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882783.11530: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882783.11535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882783.11537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882783.11539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882783.11593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882783.11597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882783.11612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882783.11674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882783.13440: stdout chunk (state=3): >>>/root <<< 22225 1726882783.13548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882783.13608: stderr chunk (state=3): >>><<< 22225 1726882783.13612: stdout chunk (state=3): >>><<< 22225 1726882783.13634: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882783.13646: _low_level_execute_command(): starting 22225 1726882783.13653: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217 `" && echo ansible-tmp-1726882783.1363435-23588-61412219743217="` echo /root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217 `" ) && sleep 0' 22225 1726882783.14328: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882783.14332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882783.14335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882783.14338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882783.14348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882783.14355: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882783.14394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882783.14397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882783.14400: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882783.14402: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 
1726882783.14405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882783.14407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882783.14420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882783.14478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882783.14481: stderr chunk (state=3): >>>debug2: match found <<< 22225 1726882783.14484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882783.14519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882783.14534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882783.14550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882783.14626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882783.16627: stdout chunk (state=3): >>>ansible-tmp-1726882783.1363435-23588-61412219743217=/root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217 <<< 22225 1726882783.16846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882783.16849: stdout chunk (state=3): >>><<< 22225 1726882783.16852: stderr chunk (state=3): >>><<< 22225 1726882783.16937: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882783.1363435-23588-61412219743217=/root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882783.16940: variable 'ansible_module_compression' from source: unknown 22225 1726882783.16998: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 22225 1726882783.17047: variable 'ansible_facts' from source: unknown 22225 1726882783.17158: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/AnsiballZ_dnf.py 22225 1726882783.17305: Sending initial data 22225 1726882783.17309: Sent initial data (151 bytes) 22225 1726882783.18139: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882783.18196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882783.18209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882783.18308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882783.18381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882783.20036: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882783.20087: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882783.20152: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpftnk6gn1 /root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/AnsiballZ_dnf.py <<< 22225 1726882783.20157: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/AnsiballZ_dnf.py" <<< 22225 1726882783.20194: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpftnk6gn1" to remote "/root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/AnsiballZ_dnf.py" <<< 22225 1726882783.21210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882783.21411: stderr chunk (state=3): >>><<< 22225 1726882783.21416: stdout chunk (state=3): >>><<< 22225 1726882783.21419: done transferring module to remote 22225 1726882783.21423: _low_level_execute_command(): starting 22225 1726882783.21426: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/ /root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/AnsiballZ_dnf.py && sleep 0' 22225 1726882783.21965: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882783.21976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882783.21984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882783.21999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882783.22010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882783.22017: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882783.22029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882783.22044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882783.22052: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882783.22058: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882783.22066: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882783.22087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882783.22191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882783.22194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882783.22244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882783.24230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882783.24234: stdout chunk (state=3): >>><<< 22225 1726882783.24237: stderr chunk (state=3): >>><<< 22225 
1726882783.24342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882783.24351: _low_level_execute_command(): starting 22225 1726882783.24356: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/AnsiballZ_dnf.py && sleep 0' 22225 1726882783.24930: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882783.24934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882783.24936: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882783.24939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882783.24941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882783.24999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882783.25003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882783.25006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882783.25068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882784.32658: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], 
"installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 22225 1726882784.37448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. <<< 22225 1726882784.37505: stderr chunk (state=3): >>><<< 22225 1726882784.37509: stdout chunk (state=3): >>><<< 22225 1726882784.37526: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
22225 1726882784.37568: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882784.37574: _low_level_execute_command(): starting 22225 1726882784.37581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882783.1363435-23588-61412219743217/ > /dev/null 2>&1 && sleep 0' 22225 1726882784.38088: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882784.38092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882784.38094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.38096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882784.38105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.38150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882784.38164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882784.38227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882784.40156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882784.40209: stderr chunk (state=3): >>><<< 22225 1726882784.40212: stdout chunk (state=3): >>><<< 22225 1726882784.40225: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882784.40235: handler run complete 22225 1726882784.40359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22225 1726882784.40513: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22225 1726882784.40549: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22225 1726882784.40573: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22225 1726882784.40598: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22225 1726882784.40658: variable '__install_status' from source: set_fact 22225 1726882784.40672: Evaluated conditional (__install_status is success): True 22225 1726882784.40686: attempt loop complete, returning result 22225 1726882784.40692: _execute() done 22225 1726882784.40694: dumping result to json 22225 1726882784.40700: done dumping result, returning 22225 1726882784.40707: done running TaskExecutor() for managed_node1/TASK: Install iproute [0affc7ec-ae25-ec05-55b7-0000000005cf] 22225 1726882784.40712: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005cf 22225 1726882784.40821: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005cf 22225 1726882784.40826: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 22225 1726882784.40920: no more pending results, returning what we have 22225 1726882784.40929: results queue empty 22225 1726882784.40930: checking for any_errors_fatal 22225 1726882784.40938: done checking for any_errors_fatal 22225 1726882784.40939: checking for max_fail_percentage 22225 1726882784.40940: done checking for max_fail_percentage 22225 1726882784.40941: checking to see if all hosts have failed and the running result is not ok 22225 1726882784.40942: done checking to see if all hosts have failed 22225 1726882784.40942: getting the remaining hosts for this loop 22225 1726882784.40945: done getting the remaining hosts for this loop 22225 1726882784.40949: getting the next task for host managed_node1 22225 1726882784.40956: done getting next task for host managed_node1 22225 1726882784.40959: ^ task is: TASK: Create veth interface {{ interface }} 22225 1726882784.40962: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882784.40965: getting variables 22225 1726882784.40967: in VariableManager get_vars() 22225 1726882784.41011: Calling all_inventory to load vars for managed_node1 22225 1726882784.41013: Calling groups_inventory to load vars for managed_node1 22225 1726882784.41015: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882784.41031: Calling all_plugins_play to load vars for managed_node1 22225 1726882784.41037: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882784.41041: Calling groups_plugins_play to load vars for managed_node1 22225 1726882784.42074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882784.44120: done with get_vars() 22225 1726882784.44152: done getting variables 22225 1726882784.44227: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882784.44365: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:39:44 -0400 (0:00:01.370) 0:00:39.836 ****** 22225 1726882784.44403: entering _queue_task() for managed_node1/command 22225 1726882784.44754: worker is 1 (out of 1 available) 22225 1726882784.44769: exiting _queue_task() for managed_node1/command 22225 1726882784.44784: done queuing things up, now waiting for results queue to drain 22225 1726882784.44786: waiting for pending results... 
22225 1726882784.45242: running TaskExecutor() for managed_node1/TASK: Create veth interface veth0 22225 1726882784.45341: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005d0 22225 1726882784.45344: variable 'ansible_search_path' from source: unknown 22225 1726882784.45347: variable 'ansible_search_path' from source: unknown 22225 1726882784.45598: variable 'interface' from source: play vars 22225 1726882784.45700: variable 'interface' from source: play vars 22225 1726882784.45796: variable 'interface' from source: play vars 22225 1726882784.45966: Loaded config def from plugin (lookup/items) 22225 1726882784.45983: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 22225 1726882784.46018: variable 'omit' from source: magic vars 22225 1726882784.46174: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882784.46194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882784.46218: variable 'omit' from source: magic vars 22225 1726882784.46487: variable 'ansible_distribution_major_version' from source: facts 22225 1726882784.46540: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882784.46734: variable 'type' from source: play vars 22225 1726882784.46745: variable 'state' from source: include params 22225 1726882784.46760: variable 'interface' from source: play vars 22225 1726882784.46770: variable 'current_interfaces' from source: set_fact 22225 1726882784.46785: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 22225 1726882784.46826: when evaluation is False, skipping this task 22225 1726882784.46830: variable 'item' from source: unknown 22225 1726882784.46910: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 22225 1726882784.47329: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882784.47332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882784.47335: variable 'omit' from source: magic vars 22225 1726882784.47339: variable 'ansible_distribution_major_version' from source: facts 22225 1726882784.47349: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882784.47560: variable 'type' from source: play vars 22225 1726882784.47568: variable 'state' from source: include params 22225 1726882784.47575: variable 'interface' from source: play vars 22225 1726882784.47586: variable 'current_interfaces' from source: set_fact 22225 1726882784.47596: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 22225 1726882784.47601: when evaluation is False, skipping this task 22225 1726882784.47634: variable 'item' from source: unknown 22225 1726882784.47707: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 22225 1726882784.48027: variable 
'ansible_host' from source: host vars for 'managed_node1' 22225 1726882784.48031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882784.48035: variable 'omit' from source: magic vars 22225 1726882784.48054: variable 'ansible_distribution_major_version' from source: facts 22225 1726882784.48065: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882784.48274: variable 'type' from source: play vars 22225 1726882784.48288: variable 'state' from source: include params 22225 1726882784.48298: variable 'interface' from source: play vars 22225 1726882784.48307: variable 'current_interfaces' from source: set_fact 22225 1726882784.48317: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 22225 1726882784.48326: when evaluation is False, skipping this task 22225 1726882784.48359: variable 'item' from source: unknown 22225 1726882784.48435: variable 'item' from source: unknown skipping: [managed_node1] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 22225 1726882784.48727: dumping result to json 22225 1726882784.48731: done dumping result, returning 22225 1726882784.48734: done running TaskExecutor() for managed_node1/TASK: Create veth interface veth0 [0affc7ec-ae25-ec05-55b7-0000000005d0] 22225 1726882784.48737: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d0 22225 1726882784.48785: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d0 22225 1726882784.48788: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false } MSG: All items skipped 22225 1726882784.48832: no more pending results, returning what we have 22225 1726882784.48838: results queue empty 22225 1726882784.48839: checking for any_errors_fatal 22225 1726882784.48849: done checking for any_errors_fatal 22225 1726882784.48850: checking for max_fail_percentage 22225 1726882784.48852: done checking for max_fail_percentage 22225 1726882784.48853: checking to see if all hosts have failed and the running result is not ok 22225 1726882784.48854: done checking to see if all hosts have failed 22225 1726882784.48855: getting the remaining hosts for this loop 22225 1726882784.48856: done getting the remaining hosts for this loop 22225 1726882784.48861: getting the next task for host managed_node1 22225 1726882784.48868: done getting next task for host managed_node1 22225 1726882784.48872: ^ task is: TASK: Set up veth as managed by NetworkManager 22225 1726882784.48875: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882784.48882: getting variables 22225 1726882784.48884: in VariableManager get_vars() 22225 1726882784.48936: Calling all_inventory to load vars for managed_node1 22225 1726882784.48939: Calling groups_inventory to load vars for managed_node1 22225 1726882784.48942: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882784.48956: Calling all_plugins_play to load vars for managed_node1 22225 1726882784.48958: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882784.48962: Calling groups_plugins_play to load vars for managed_node1 22225 1726882784.50970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882784.52144: done with get_vars() 22225 1726882784.52163: done getting variables 22225 1726882784.52212: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:39:44 -0400 (0:00:00.078) 0:00:39.915 ****** 22225 1726882784.52241: entering _queue_task() for managed_node1/command 22225 1726882784.52512: worker is 1 (out of 1 available) 22225 1726882784.52530: exiting _queue_task() for managed_node1/command 22225 1726882784.52544: done queuing things up, now waiting for results queue to drain 22225 1726882784.52546: waiting for pending results... 
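For reference, the "Create veth interface veth0" task whose three items were skipped above presumably looks something like the sketch below in manage_test_interface.yml:27. This is a reconstruction from the logged loop items, the logged false_condition, and the fact that the 'items' lookup plugin was loaded (suggesting with_items); it is not the actual file contents, and the exact layout may differ:

    - name: Create veth interface {{ interface }}
      # Reconstructed sketch; items and condition taken from the trace above.
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
        - ip link set {{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces

With interface=veth0 and the conditional evaluating False, every item is skipped, which matches the three "skipping" results logged above.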
22225 1726882784.52969: running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager 22225 1726882784.52975: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005d1 22225 1726882784.52991: variable 'ansible_search_path' from source: unknown 22225 1726882784.52999: variable 'ansible_search_path' from source: unknown 22225 1726882784.53047: calling self._execute() 22225 1726882784.53182: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882784.53197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882784.53214: variable 'omit' from source: magic vars 22225 1726882784.53745: variable 'ansible_distribution_major_version' from source: facts 22225 1726882784.53757: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882784.53897: variable 'type' from source: play vars 22225 1726882784.53901: variable 'state' from source: include params 22225 1726882784.53907: Evaluated conditional (type == 'veth' and state == 'present'): False 22225 1726882784.53910: when evaluation is False, skipping this task 22225 1726882784.53913: _execute() done 22225 1726882784.53916: dumping result to json 22225 1726882784.53918: done dumping result, returning 22225 1726882784.53925: done running TaskExecutor() for managed_node1/TASK: Set up veth as managed by NetworkManager [0affc7ec-ae25-ec05-55b7-0000000005d1] 22225 1726882784.53931: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d1 22225 1726882784.54028: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d1 22225 1726882784.54031: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 22225 1726882784.54090: no more pending results, returning what we have 22225 1726882784.54095: results queue empty 22225 1726882784.54097: checking for any_errors_fatal 22225 1726882784.54109: done checking for any_errors_fatal 22225 1726882784.54110: checking for max_fail_percentage 22225 1726882784.54111: done checking for max_fail_percentage 22225 1726882784.54112: checking to see if all hosts have failed and the running result is not ok 22225 1726882784.54113: done checking to see if all hosts have failed 22225 1726882784.54114: getting the remaining hosts for this loop 22225 1726882784.54115: done getting the remaining hosts for this loop 22225 1726882784.54119: getting the next task for host managed_node1 22225 1726882784.54126: done getting next task for host managed_node1 22225 1726882784.54129: ^ task is: TASK: Delete veth interface {{ interface }} 22225 1726882784.54132: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882784.54136: getting variables 22225 1726882784.54138: in VariableManager get_vars() 22225 1726882784.54177: Calling all_inventory to load vars for managed_node1 22225 1726882784.54182: Calling groups_inventory to load vars for managed_node1 22225 1726882784.54184: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882784.54195: Calling all_plugins_play to load vars for managed_node1 22225 1726882784.54198: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882784.54200: Calling groups_plugins_play to load vars for managed_node1 22225 1726882784.55181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882784.56944: done with get_vars() 22225 1726882784.56970: done getting variables 22225 1726882784.57089: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882784.57251: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:39:44 -0400 (0:00:00.050) 0:00:39.965 ****** 22225 1726882784.57275: entering _queue_task() for managed_node1/command 22225 1726882784.57560: worker is 1 (out of 1 available) 22225 1726882784.57576: exiting _queue_task() for managed_node1/command 22225 1726882784.57587: done queuing things up, now waiting for results queue to drain 22225 1726882784.57588: waiting for pending results... 
22225 1726882784.57780: running TaskExecutor() for managed_node1/TASK: Delete veth interface veth0 22225 1726882784.57870: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005d2 22225 1726882784.57880: variable 'ansible_search_path' from source: unknown 22225 1726882784.57886: variable 'ansible_search_path' from source: unknown 22225 1726882784.57922: calling self._execute() 22225 1726882784.58000: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882784.58004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882784.58014: variable 'omit' from source: magic vars 22225 1726882784.58309: variable 'ansible_distribution_major_version' from source: facts 22225 1726882784.58319: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882784.58462: variable 'type' from source: play vars 22225 1726882784.58465: variable 'state' from source: include params 22225 1726882784.58472: variable 'interface' from source: play vars 22225 1726882784.58475: variable 'current_interfaces' from source: set_fact 22225 1726882784.58488: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 22225 1726882784.58491: variable 'omit' from source: magic vars 22225 1726882784.58521: variable 'omit' from source: magic vars 22225 1726882784.58596: variable 'interface' from source: play vars 22225 1726882784.58610: variable 'omit' from source: magic vars 22225 1726882784.58646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882784.58675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882784.58696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882784.58712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882784.58724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882784.58749: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882784.58752: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882784.58755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882784.58832: Set connection var ansible_connection to ssh 22225 1726882784.58840: Set connection var ansible_pipelining to False 22225 1726882784.58847: Set connection var ansible_shell_executable to /bin/sh 22225 1726882784.58853: Set connection var ansible_timeout to 10 22225 1726882784.58856: Set connection var ansible_shell_type to sh 22225 1726882784.58861: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882784.58886: variable 'ansible_shell_executable' from source: unknown 22225 1726882784.58890: variable 'ansible_connection' from source: unknown 22225 1726882784.58894: variable 'ansible_module_compression' from source: unknown 22225 1726882784.58896: variable 'ansible_shell_type' from source: unknown 22225 1726882784.58899: variable 'ansible_shell_executable' from source: unknown 22225 1726882784.58901: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882784.58904: variable 'ansible_pipelining' from source: unknown 22225 1726882784.58906: variable 'ansible_timeout' from source: unknown 22225 1726882784.58957: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882784.59047: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882784.59164: variable 'omit' from source: magic vars 22225 1726882784.59168: starting attempt loop 22225 1726882784.59171: running the handler 22225 1726882784.59173: _low_level_execute_command(): starting 22225 1726882784.59175: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882784.59862: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.59896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882784.59914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882784.59984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882784.61708: stdout chunk (state=3): >>>/root <<< 22225 1726882784.61815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882784.61873: stderr chunk (state=3): >>><<< 22225 1726882784.61876: stdout chunk (state=3): >>><<< 22225 1726882784.61897: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 22225 1726882784.61910: _low_level_execute_command(): starting 22225 1726882784.61915: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637 `" && echo ansible-tmp-1726882784.6189806-23642-55593643970637="` echo /root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637 `" ) && sleep 0' 22225 1726882784.62383: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882784.62387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.62398: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882784.62400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882784.62403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.62445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882784.62448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882784.62517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882784.64470: stdout chunk (state=3): >>>ansible-tmp-1726882784.6189806-23642-55593643970637=/root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637 <<< 22225 1726882784.64591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882784.64639: stderr chunk (state=3): >>><<< 22225 1726882784.64643: stdout chunk (state=3): >>><<< 22225 1726882784.64658: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882784.6189806-23642-55593643970637=/root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882784.64689: variable 'ansible_module_compression' from source: unknown 22225 1726882784.64736: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882784.64768: variable 'ansible_facts' from source: unknown 22225 1726882784.64824: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/AnsiballZ_command.py 22225 1726882784.64931: Sending initial data 22225 1726882784.64935: Sent initial data (155 bytes) 22225 1726882784.65397: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882784.65400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882784.65403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882784.65405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882784.65408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.65458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882784.65462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882784.65515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882784.67084: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22225 1726882784.67091: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882784.67132: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882784.67181: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmprhnem3u0 /root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/AnsiballZ_command.py <<< 22225 1726882784.67184: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/AnsiballZ_command.py" <<< 22225 1726882784.67235: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmprhnem3u0" to remote "/root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/AnsiballZ_command.py" <<< 22225 1726882784.67243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/AnsiballZ_command.py" <<< 22225 1726882784.67836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882784.67902: stderr chunk (state=3): >>><<< 22225 1726882784.67907: stdout chunk (state=3): >>><<< 22225 1726882784.67926: done transferring module to remote 22225 1726882784.67938: _low_level_execute_command(): starting 22225 1726882784.67943: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/ /root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/AnsiballZ_command.py && sleep 0' 22225 1726882784.68406: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882784.68409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882784.68411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.68414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882784.68419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found <<< 22225 1726882784.68423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.68469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882784.68473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882784.68529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882784.70334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882784.70387: stderr chunk (state=3): >>><<< 22225 1726882784.70391: stdout chunk (state=3): >>><<< 22225 1726882784.70406: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882784.70409: _low_level_execute_command(): starting 22225 1726882784.70414: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/AnsiballZ_command.py && sleep 0' 22225 1726882784.70872: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882784.70875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882784.70878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.70883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882784.70885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882784.70930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882784.70950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882784.70999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882784.88872: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:39:44.873583", "end": "2024-09-20 21:39:44.886038", "delta": "0:00:00.012455", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882784.91698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882784.91702: stdout chunk (state=3): >>><<< 22225 1726882784.91705: stderr chunk (state=3): >>><<< 22225 1726882784.91707: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:39:44.873583", "end": "2024-09-20 21:39:44.886038", "delta": "0:00:00.012455", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
22225 1726882784.91711: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882784.91714: _low_level_execute_command(): starting 22225 1726882784.91716: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882784.6189806-23642-55593643970637/ > /dev/null 2>&1 && sleep 0' 22225 1726882784.92358: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882784.92376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882784.92397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882784.92418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882784.92470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882784.92489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882784.92585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882784.92602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882784.92626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882784.92714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882784.94743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882784.94774: stderr chunk (state=3): >>><<< 22225 1726882784.94793: stdout chunk (state=3): >>><<< 22225 1726882784.94820: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882784.94837: handler run complete 22225 1726882784.94868: Evaluated conditional (False): False 22225 1726882784.94886: attempt loop complete, returning result 22225 1726882784.94931: _execute() done 22225 1726882784.94934: dumping result to json 22225 1726882784.94936: done dumping result, returning 22225 1726882784.94939: done running TaskExecutor() for managed_node1/TASK: Delete veth interface veth0 [0affc7ec-ae25-ec05-55b7-0000000005d2] 22225 1726882784.94941: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d2 ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.012455", "end": "2024-09-20 21:39:44.886038", "rc": 0, "start": "2024-09-20 21:39:44.873583" } 22225 1726882784.95200: no more pending results, returning what we have 22225 1726882784.95204: results queue empty 22225 1726882784.95205: checking for any_errors_fatal 22225 1726882784.95214: done checking for any_errors_fatal 22225 1726882784.95215: checking for max_fail_percentage 22225 1726882784.95217: done checking for max_fail_percentage 22225 1726882784.95218: checking to see if all hosts have failed and the running result is not ok 22225 1726882784.95219: done checking to see if all hosts have failed 22225 1726882784.95220: getting the remaining hosts for this loop 22225 1726882784.95224: done getting the remaining hosts for this loop 22225 1726882784.95229: getting the next task for host managed_node1 22225 1726882784.95238: done getting next task for host managed_node1 22225 1726882784.95241: ^ task is: TASK: Create dummy interface {{ interface }} 22225 1726882784.95245: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882784.95250: getting variables 22225 1726882784.95252: in VariableManager get_vars() 22225 1726882784.95300: Calling all_inventory to load vars for managed_node1 22225 1726882784.95304: Calling groups_inventory to load vars for managed_node1 22225 1726882784.95306: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882784.95319: Calling all_plugins_play to load vars for managed_node1 22225 1726882784.95637: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882784.95644: Calling groups_plugins_play to load vars for managed_node1 22225 1726882784.96165: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d2 22225 1726882784.96169: WORKER PROCESS EXITING 22225 1726882784.97379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882784.99579: done with get_vars() 22225 1726882784.99612: done getting variables 22225 1726882784.99678: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882784.99804: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:39:44 -0400 (0:00:00.425) 0:00:40.391 ****** 22225 1726882784.99837: entering _queue_task() for managed_node1/command 22225 1726882785.00232: worker is 1 (out of 1 available) 22225 1726882785.00248: exiting _queue_task() for managed_node1/command 22225 1726882785.00261: done queuing things up, now waiting for results queue to drain 22225 1726882785.00263: waiting for pending results... 
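The "Delete veth interface veth0" task is the only one in this block whose conditional evaluated True, so it actually executed "ip link del veth0 type veth" on the target. Judging from the logged module arguments, the logged condition, and the task path (manage_test_interface.yml:43), the task presumably reads roughly as follows; this is a sketch inferred from the trace, not the file itself:

    - name: Delete veth interface {{ interface }}
      # Sketch; command and 'when' expression copied from the logged
      # module_args and conditional evaluation above.
      command: ip link del {{ interface }} type veth
      when: type == 'veth' and state == 'absent' and interface in current_interfaces

The module result ({"changed": true, "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"]}) confirms the interface was removed in about 12 ms.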
22225 1726882785.00647: running TaskExecutor() for managed_node1/TASK: Create dummy interface veth0 22225 1726882785.00732: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005d3 22225 1726882785.00760: variable 'ansible_search_path' from source: unknown 22225 1726882785.00775: variable 'ansible_search_path' from source: unknown 22225 1726882785.00820: calling self._execute() 22225 1726882785.00950: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882785.00977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882785.01086: variable 'omit' from source: magic vars 22225 1726882785.01423: variable 'ansible_distribution_major_version' from source: facts 22225 1726882785.01445: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882785.01678: variable 'type' from source: play vars 22225 1726882785.01691: variable 'state' from source: include params 22225 1726882785.01700: variable 'interface' from source: play vars 22225 1726882785.01710: variable 'current_interfaces' from source: set_fact 22225 1726882785.01721: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 22225 1726882785.01731: when evaluation is False, skipping this task 22225 1726882785.01747: _execute() done 22225 1726882785.01759: dumping result to json 22225 1726882785.01767: done dumping result, returning 22225 1726882785.01780: done running TaskExecutor() for managed_node1/TASK: Create dummy interface veth0 [0affc7ec-ae25-ec05-55b7-0000000005d3] 22225 1726882785.01791: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d3 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 22225 1726882785.02057: no more pending results, returning what we have 22225 1726882785.02069: results queue empty 22225 1726882785.02070: checking for any_errors_fatal 22225 1726882785.02083: done checking for any_errors_fatal 22225 1726882785.02084: checking for max_fail_percentage 22225 1726882785.02086: done checking for max_fail_percentage 22225 1726882785.02087: checking to see if all hosts have failed and the running result is not ok 22225 1726882785.02088: done checking to see if all hosts have failed 22225 1726882785.02089: getting the remaining hosts for this loop 22225 1726882785.02092: done getting the remaining hosts for this loop 22225 1726882785.02096: getting the next task for host managed_node1 22225 1726882785.02104: done getting next task for host managed_node1 22225 1726882785.02107: ^ task is: TASK: Delete dummy interface {{ interface }} 22225 1726882785.02111: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882785.02116: getting variables 22225 1726882785.02118: in VariableManager get_vars() 22225 1726882785.02285: Calling all_inventory to load vars for managed_node1 22225 1726882785.02288: Calling groups_inventory to load vars for managed_node1 22225 1726882785.02291: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882785.02304: Calling all_plugins_play to load vars for managed_node1 22225 1726882785.02306: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882785.02309: Calling groups_plugins_play to load vars for managed_node1 22225 1726882785.03139: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d3 22225 1726882785.03143: WORKER PROCESS EXITING 22225 1726882785.04421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882785.08835: done with get_vars() 22225 1726882785.08931: done getting variables 22225 1726882785.09118: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882785.09354: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:39:45 -0400 (0:00:00.095) 0:00:40.486 ****** 22225 1726882785.09383: entering _queue_task() for managed_node1/command 22225 1726882785.10005: worker is 1 (out of 1 available) 22225 1726882785.10019: exiting _queue_task() for managed_node1/command 22225 1726882785.10040: done queuing things up, now waiting for results queue to drain 22225 1726882785.10042: waiting for pending results... 
22225 1726882785.10813: running TaskExecutor() for managed_node1/TASK: Delete dummy interface veth0 22225 1726882785.10972: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005d4 22225 1726882785.11003: variable 'ansible_search_path' from source: unknown 22225 1726882785.11012: variable 'ansible_search_path' from source: unknown 22225 1726882785.11067: calling self._execute() 22225 1726882785.11195: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882785.11220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882785.11243: variable 'omit' from source: magic vars 22225 1726882785.11746: variable 'ansible_distribution_major_version' from source: facts 22225 1726882785.11770: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882785.12035: variable 'type' from source: play vars 22225 1726882785.12048: variable 'state' from source: include params 22225 1726882785.12057: variable 'interface' from source: play vars 22225 1726882785.12065: variable 'current_interfaces' from source: set_fact 22225 1726882785.12077: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 22225 1726882785.12089: when evaluation is False, skipping this task 22225 1726882785.12097: _execute() done 22225 1726882785.12104: dumping result to json 22225 1726882785.12113: done dumping result, returning 22225 1726882785.12131: done running TaskExecutor() for managed_node1/TASK: Delete dummy interface veth0 [0affc7ec-ae25-ec05-55b7-0000000005d4] 22225 1726882785.12143: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d4 22225 1726882785.12304: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d4 22225 1726882785.12308: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22225 1726882785.12375: no more pending results, returning what we have 22225 1726882785.12381: results queue empty 22225 1726882785.12382: checking for any_errors_fatal 22225 1726882785.12390: done checking for any_errors_fatal 22225 1726882785.12391: checking for max_fail_percentage 22225 1726882785.12393: done checking for max_fail_percentage 22225 1726882785.12395: checking to see if all hosts have failed and the running result is not ok 22225 1726882785.12396: done checking to see if all hosts have failed 22225 1726882785.12396: getting the remaining hosts for this loop 22225 1726882785.12399: done getting the remaining hosts for this loop 22225 1726882785.12404: getting the next task for host managed_node1 22225 1726882785.12412: done getting next task for host managed_node1 22225 1726882785.12417: ^ task is: TASK: Create tap interface {{ interface }} 22225 1726882785.12524: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882785.12531: getting variables 22225 1726882785.12533: in VariableManager get_vars() 22225 1726882785.12649: Calling all_inventory to load vars for managed_node1 22225 1726882785.12652: Calling groups_inventory to load vars for managed_node1 22225 1726882785.12654: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882785.12669: Calling all_plugins_play to load vars for managed_node1 22225 1726882785.12680: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882785.12684: Calling groups_plugins_play to load vars for managed_node1 22225 1726882785.14661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882785.17231: done with get_vars() 22225 1726882785.17262: done getting variables 22225 1726882785.17326: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882785.17444: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:39:45 -0400 (0:00:00.080) 0:00:40.567 ****** 22225 1726882785.17481: entering _queue_task() for managed_node1/command 22225 1726882785.17948: worker is 1 (out of 1 available) 22225 1726882785.17962: exiting _queue_task() for managed_node1/command 22225 1726882785.17974: done queuing things up, now waiting for results queue to drain 22225 1726882785.17976: waiting for pending results... 
22225 1726882785.18326: running TaskExecutor() for managed_node1/TASK: Create tap interface veth0 22225 1726882785.18450: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005d5 22225 1726882785.18454: variable 'ansible_search_path' from source: unknown 22225 1726882785.18457: variable 'ansible_search_path' from source: unknown 22225 1726882785.18460: calling self._execute() 22225 1726882785.18560: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882785.18578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882785.18594: variable 'omit' from source: magic vars 22225 1726882785.19027: variable 'ansible_distribution_major_version' from source: facts 22225 1726882785.19045: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882785.19316: variable 'type' from source: play vars 22225 1726882785.19319: variable 'state' from source: include params 22225 1726882785.19323: variable 'interface' from source: play vars 22225 1726882785.19326: variable 'current_interfaces' from source: set_fact 22225 1726882785.19329: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 22225 1726882785.19332: when evaluation is False, skipping this task 22225 1726882785.19344: _execute() done 22225 1726882785.19427: dumping result to json 22225 1726882785.19433: done dumping result, returning 22225 1726882785.19435: done running TaskExecutor() for managed_node1/TASK: Create tap interface veth0 [0affc7ec-ae25-ec05-55b7-0000000005d5] 22225 1726882785.19437: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d5 22225 1726882785.19513: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d5 22225 1726882785.19517: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 22225 1726882785.19579: no more pending results, returning what we have 22225 1726882785.19583: results queue empty 22225 1726882785.19585: checking for any_errors_fatal 22225 1726882785.19596: done checking for any_errors_fatal 22225 1726882785.19597: checking for max_fail_percentage 22225 1726882785.19599: done checking for max_fail_percentage 22225 1726882785.19600: checking to see if all hosts have failed and the running result is not ok 22225 1726882785.19601: done checking to see if all hosts have failed 22225 1726882785.19602: getting the remaining hosts for this loop 22225 1726882785.19604: done getting the remaining hosts for this loop 22225 1726882785.19609: getting the next task for host managed_node1 22225 1726882785.19616: done getting next task for host managed_node1 22225 1726882785.19619: ^ task is: TASK: Delete tap interface {{ interface }} 22225 1726882785.19626: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882785.19632: getting variables 22225 1726882785.19635: in VariableManager get_vars() 22225 1726882785.19685: Calling all_inventory to load vars for managed_node1 22225 1726882785.19688: Calling groups_inventory to load vars for managed_node1 22225 1726882785.19691: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882785.19707: Calling all_plugins_play to load vars for managed_node1 22225 1726882785.19710: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882785.19713: Calling groups_plugins_play to load vars for managed_node1 22225 1726882785.21757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882785.23862: done with get_vars() 22225 1726882785.23888: done getting variables 22225 1726882785.23956: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22225 1726882785.24084: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:39:45 -0400 (0:00:00.066) 0:00:40.633 ****** 22225 1726882785.24116: entering _queue_task() for managed_node1/command 22225 1726882785.24471: worker is 1 (out of 1 available) 22225 1726882785.24486: exiting _queue_task() for managed_node1/command 22225 1726882785.24500: done queuing things up, now waiting for results queue to drain 22225 1726882785.24501: waiting for pending results... 
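As the variable-source lines above show, interface and type resolve from play vars while state arrives as an include parameter, which is how one manage_test_interface.yml file serves both setup and teardown of the test interface. A hypothetical sketch of that split (the concrete YAML in the calling playbook is not visible in this excerpt, and the type value below is only illustrative):

    # Hypothetical: variable names and their sources match what the log reports
    # (type/interface from play vars, state passed as an include parameter).
    vars:
      interface: veth0
      type: veth              # illustrative value; not shown in this log
    tasks:
      - name: Remove the test interface
        include_tasks: tasks/manage_test_interface.yml
        vars:
          state: absent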
22225 1726882785.24939: running TaskExecutor() for managed_node1/TASK: Delete tap interface veth0 22225 1726882785.24954: in run() - task 0affc7ec-ae25-ec05-55b7-0000000005d6 22225 1726882785.24965: variable 'ansible_search_path' from source: unknown 22225 1726882785.24973: variable 'ansible_search_path' from source: unknown 22225 1726882785.25015: calling self._execute() 22225 1726882785.25131: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882785.25144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882785.25169: variable 'omit' from source: magic vars 22225 1726882785.25608: variable 'ansible_distribution_major_version' from source: facts 22225 1726882785.25611: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882785.25833: variable 'type' from source: play vars 22225 1726882785.25845: variable 'state' from source: include params 22225 1726882785.25854: variable 'interface' from source: play vars 22225 1726882785.25863: variable 'current_interfaces' from source: set_fact 22225 1726882785.25932: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 22225 1726882785.25936: when evaluation is False, skipping this task 22225 1726882785.25939: _execute() done 22225 1726882785.25941: dumping result to json 22225 1726882785.25944: done dumping result, returning 22225 1726882785.25946: done running TaskExecutor() for managed_node1/TASK: Delete tap interface veth0 [0affc7ec-ae25-ec05-55b7-0000000005d6] 22225 1726882785.25949: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d6 skipping: [managed_node1] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22225 1726882785.26087: no more pending results, returning what we have 22225 1726882785.26092: results queue empty 22225 1726882785.26093: checking for any_errors_fatal 22225 1726882785.26101: done checking for any_errors_fatal 22225 1726882785.26102: checking for max_fail_percentage 22225 1726882785.26104: done checking for max_fail_percentage 22225 1726882785.26105: checking to see if all hosts have failed and the running result is not ok 22225 1726882785.26106: done checking to see if all hosts have failed 22225 1726882785.26107: getting the remaining hosts for this loop 22225 1726882785.26109: done getting the remaining hosts for this loop 22225 1726882785.26114: getting the next task for host managed_node1 22225 1726882785.26124: done getting next task for host managed_node1 22225 1726882785.26128: ^ task is: TASK: Clean up namespace 22225 1726882785.26131: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=6, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882785.26137: getting variables 22225 1726882785.26139: in VariableManager get_vars() 22225 1726882785.26186: Calling all_inventory to load vars for managed_node1 22225 1726882785.26189: Calling groups_inventory to load vars for managed_node1 22225 1726882785.26191: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882785.26207: Calling all_plugins_play to load vars for managed_node1 22225 1726882785.26211: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882785.26214: Calling groups_plugins_play to load vars for managed_node1 22225 1726882785.26434: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000005d6 22225 1726882785.26438: WORKER PROCESS EXITING 22225 1726882785.33193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882785.35277: done with get_vars() 22225 1726882785.35307: done getting variables 22225 1726882785.35367: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Clean up namespace] ****************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108 Friday 20 September 2024 21:39:45 -0400 (0:00:00.112) 0:00:40.746 ****** 22225 1726882785.35394: entering _queue_task() for managed_node1/command 22225 1726882785.35775: worker is 1 (out of 1 available) 22225 1726882785.35795: exiting _queue_task() for managed_node1/command 22225 1726882785.35808: done queuing things up, now waiting for results queue to drain 22225 1726882785.35809: waiting for pending results... 
22225 1726882785.36122: running TaskExecutor() for managed_node1/TASK: Clean up namespace 22225 1726882785.36248: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000b4 22225 1726882785.36271: variable 'ansible_search_path' from source: unknown 22225 1726882785.36318: calling self._execute() 22225 1726882785.36451: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882785.36473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882785.36488: variable 'omit' from source: magic vars 22225 1726882785.36918: variable 'ansible_distribution_major_version' from source: facts 22225 1726882785.37007: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882785.37011: variable 'omit' from source: magic vars 22225 1726882785.37013: variable 'omit' from source: magic vars 22225 1726882785.37019: variable 'omit' from source: magic vars 22225 1726882785.37064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882785.37106: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882785.37139: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882785.37162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882785.37180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882785.37231: variable 'inventory_hostname' from source: host vars for 'managed_node1' 22225 1726882785.37243: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882785.37251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882785.37367: Set connection var ansible_connection to ssh 22225 1726882785.37428: Set connection var ansible_pipelining to False 22225 1726882785.37432: Set connection var ansible_shell_executable to /bin/sh 22225 1726882785.37440: Set connection var ansible_timeout to 10 22225 1726882785.37448: Set connection var ansible_shell_type to sh 22225 1726882785.37452: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882785.37468: variable 'ansible_shell_executable' from source: unknown 22225 1726882785.37477: variable 'ansible_connection' from source: unknown 22225 1726882785.37486: variable 'ansible_module_compression' from source: unknown 22225 1726882785.37551: variable 'ansible_shell_type' from source: unknown 22225 1726882785.37557: variable 'ansible_shell_executable' from source: unknown 22225 1726882785.37561: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882785.37563: variable 'ansible_pipelining' from source: unknown 22225 1726882785.37566: variable 'ansible_timeout' from source: unknown 22225 1726882785.37569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882785.37770: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882785.37774: variable 'omit' from source: magic vars 22225 1726882785.37778: starting attempt loop 22225 1726882785.37781: running the handler 
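The "Set connection var ..." lines above record the effective connection settings for this task: SSH transport, a /bin/sh shell, a 10-second timeout, ZIP_DEFLATED module compression, and pipelining disabled. Expressed as inventory host variables, the same settings would look roughly like this (a sketch; in this run they come from defaults and host vars rather than an explicit block like the one below):

    # Equivalent host_vars sketch of the connection settings the executor reports;
    # the values are taken directly from the log lines above.
    managed_node1:
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_timeout: 10
      ansible_pipelining: false
      ansible_module_compression: ZIP_DEFLATED

Because pipelining is disabled, each command task below stages AnsiballZ_command.py on the remote host over SFTP before running it; with ansible_pipelining set to true the module would be fed over the existing SSH channel instead.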
22225 1726882785.37783: _low_level_execute_command(): starting 22225 1726882785.37791: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882785.38650: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882785.38714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882785.38738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882785.38774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882785.38870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882785.40680: stdout chunk (state=3): >>>/root <<< 22225 1726882785.40901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882785.40904: stdout chunk (state=3): >>><<< 22225 1726882785.40907: stderr chunk (state=3): >>><<< 22225 1726882785.41047: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882785.41052: _low_level_execute_command(): starting 22225 1726882785.41056: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447 `" && echo ansible-tmp-1726882785.4094708-23671-100983002025447="` echo /root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447 `" ) && sleep 0' 22225 1726882785.41737: stderr chunk 
(state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882785.41764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882785.41781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882785.41801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882785.41818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882785.41884: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882785.41956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882785.41995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882785.42024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882785.42083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882785.44046: stdout chunk (state=3): >>>ansible-tmp-1726882785.4094708-23671-100983002025447=/root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447 <<< 22225 1726882785.44212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882785.44251: stderr chunk (state=3): >>><<< 22225 1726882785.44254: stdout chunk (state=3): >>><<< 22225 1726882785.44427: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882785.4094708-23671-100983002025447=/root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882785.44433: variable 'ansible_module_compression' from source: unknown 22225 1726882785.44436: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882785.44438: variable 'ansible_facts' from source: unknown 22225 1726882785.44519: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/AnsiballZ_command.py 22225 1726882785.44683: Sending initial data 22225 1726882785.44736: Sent initial data (156 bytes) 22225 1726882785.45306: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882785.45331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration <<< 22225 1726882785.45343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882785.45381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882785.45394: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882785.45458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882785.47041: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882785.47086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22225 1726882785.47135: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpedlnhs5o /root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/AnsiballZ_command.py <<< 22225 1726882785.47143: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/AnsiballZ_command.py" <<< 22225 1726882785.47184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpedlnhs5o" to remote "/root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/AnsiballZ_command.py" <<< 22225 1726882785.47193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/AnsiballZ_command.py" <<< 22225 1726882785.47965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882785.48003: stderr chunk (state=3): >>><<< 22225 1726882785.48129: stdout chunk (state=3): >>><<< 22225 1726882785.48133: done transferring module to remote 22225 1726882785.48135: _low_level_execute_command(): starting 22225 1726882785.48138: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/ /root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/AnsiballZ_command.py && sleep 0' 22225 1726882785.48786: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882785.48790: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882785.48867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882785.48882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882785.48945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882785.50765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882785.50833: stderr chunk (state=3): >>><<< 22225 1726882785.50837: stdout chunk (state=3): >>><<< 22225 1726882785.50840: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882785.50843: _low_level_execute_command(): starting 22225 1726882785.50845: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/AnsiballZ_command.py && sleep 0' 22225 1726882785.51526: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882785.51531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found <<< 22225 1726882785.51534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882785.51536: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882785.51539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882785.51572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882785.51575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882785.51638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882785.68615: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 21:39:45.678444", "end": "2024-09-20 21:39:45.683260", "delta": "0:00:00.004816", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882785.70332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882785.70337: stdout chunk (state=3): >>><<< 22225 1726882785.70339: stderr chunk (state=3): >>><<< 22225 1726882785.70342: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 21:39:45.678444", "end": "2024-09-20 21:39:45.683260", "delta": "0:00:00.004816", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
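The module invocation above ran ansible.legacy.command with _raw_params "ip netns delete ns1" and returned rc=0. The module itself reported changed: true, while the task result that follows shows changed: false right after an "Evaluated conditional (False): False" line, which is consistent with a changed_when: false override on the task. A sketch of what the task at tests_ipv6.yml:108 plausibly looks like (the exact YAML is not shown in this log):

    # Sketch consistent with the logged module arguments; the real task at
    # tests_ipv6.yml:108 may differ in detail.
    - name: Clean up namespace
      command: ip netns delete ns1
      changed_when: false   # assumption, inferred from the changed:false task result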
22225 1726882785.70345: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns delete ns1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882785.70347: _low_level_execute_command(): starting 22225 1726882785.70361: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882785.4094708-23671-100983002025447/ > /dev/null 2>&1 && sleep 0' 22225 1726882785.71105: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882785.71166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882785.71186: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882785.71214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882785.71311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882785.73360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882785.73365: stdout chunk (state=3): >>><<< 22225 1726882785.73367: stderr chunk (state=3): >>><<< 22225 1726882785.73735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882785.73740: handler run complete 22225 1726882785.73742: Evaluated conditional (False): False 22225 1726882785.73745: attempt loop complete, returning result 22225 1726882785.73747: _execute() done 22225 1726882785.73749: dumping result to json 22225 1726882785.73751: done dumping result, returning 22225 1726882785.73753: done running TaskExecutor() for managed_node1/TASK: Clean up namespace [0affc7ec-ae25-ec05-55b7-0000000000b4] 22225 1726882785.73755: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000b4 ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "netns", "delete", "ns1" ], "delta": "0:00:00.004816", "end": "2024-09-20 21:39:45.683260", "rc": 0, "start": "2024-09-20 21:39:45.678444" } 22225 1726882785.74008: no more pending results, returning what we have 22225 1726882785.74013: results queue empty 22225 1726882785.74014: checking for any_errors_fatal 22225 1726882785.74021: done checking for any_errors_fatal 22225 1726882785.74023: checking for max_fail_percentage 22225 1726882785.74025: done checking for max_fail_percentage 22225 1726882785.74026: checking to see if all hosts have failed and the running result is not ok 22225 1726882785.74027: done checking to see if all hosts have failed 22225 1726882785.74028: getting the remaining hosts for this loop 22225 1726882785.74030: done getting the remaining hosts for this loop 22225 1726882785.74037: getting the next task for host managed_node1 22225 1726882785.74044: done getting next task for host managed_node1 22225 1726882785.74047: ^ task is: TASK: Verify network state restored to default 22225 1726882785.74050: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=7, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882785.74053: getting variables 22225 1726882785.74055: in VariableManager get_vars() 22225 1726882785.74102: Calling all_inventory to load vars for managed_node1 22225 1726882785.74106: Calling groups_inventory to load vars for managed_node1 22225 1726882785.74108: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882785.74326: Calling all_plugins_play to load vars for managed_node1 22225 1726882785.74332: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882785.74337: Calling groups_plugins_play to load vars for managed_node1 22225 1726882785.75079: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000b4 22225 1726882785.75084: WORKER PROCESS EXITING 22225 1726882785.76580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882785.80654: done with get_vars() 22225 1726882785.80696: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:113 Friday 20 September 2024 21:39:45 -0400 (0:00:00.455) 0:00:41.201 ****** 22225 1726882785.81042: entering _queue_task() for managed_node1/include_tasks 22225 1726882785.81853: worker is 1 (out of 1 available) 22225 1726882785.81869: exiting _queue_task() for managed_node1/include_tasks 22225 1726882785.81882: done queuing things up, now waiting for results queue to drain 22225 1726882785.81884: waiting for pending results... 22225 1726882785.82227: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 22225 1726882785.82378: in run() - task 0affc7ec-ae25-ec05-55b7-0000000000b5 22225 1726882785.82403: variable 'ansible_search_path' from source: unknown 22225 1726882785.82458: calling self._execute() 22225 1726882785.82590: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882785.82606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882785.82624: variable 'omit' from source: magic vars 22225 1726882785.83086: variable 'ansible_distribution_major_version' from source: facts 22225 1726882785.83111: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882785.83128: _execute() done 22225 1726882785.83137: dumping result to json 22225 1726882785.83145: done dumping result, returning 22225 1726882785.83155: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0affc7ec-ae25-ec05-55b7-0000000000b5] 22225 1726882785.83165: sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000b5 22225 1726882785.83456: no more pending results, returning what we have 22225 1726882785.83463: in VariableManager get_vars() 22225 1726882785.83521: Calling all_inventory to load vars for managed_node1 22225 1726882785.83526: Calling groups_inventory to load vars for managed_node1 22225 1726882785.83529: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882785.83546: Calling all_plugins_play to load vars for managed_node1 22225 1726882785.83549: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882785.83553: Calling groups_plugins_play to load vars for managed_node1 22225 1726882785.84186: done sending task result for task 0affc7ec-ae25-ec05-55b7-0000000000b5 22225 1726882785.84190: WORKER PROCESS EXITING 22225 1726882785.87188: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882785.90132: done with get_vars() 22225 1726882785.90165: variable 'ansible_search_path' from source: unknown 22225 1726882785.90188: we have included files to process 22225 1726882785.90189: generating all_blocks data 22225 1726882785.90192: done generating all_blocks data 22225 1726882785.90199: processing included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22225 1726882785.90200: loading included file: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22225 1726882785.90204: Loading data from /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22225 1726882785.90837: done processing included file 22225 1726882785.90839: iterating over new_blocks loaded from include file 22225 1726882785.90841: in VariableManager get_vars() 22225 1726882785.90864: done with get_vars() 22225 1726882785.90866: filtering new block on tags 22225 1726882785.90890: done filtering new block on tags 22225 1726882785.90893: done iterating over new_blocks loaded from include file included: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 22225 1726882785.90898: extending task lists for all hosts with included blocks 22225 1726882785.94965: done extending task lists 22225 1726882785.94967: done processing included files 22225 1726882785.94968: results queue empty 22225 1726882785.94969: checking for any_errors_fatal 22225 1726882785.94978: done checking for any_errors_fatal 22225 1726882785.94980: checking for max_fail_percentage 22225 1726882785.94982: done checking for max_fail_percentage 22225 1726882785.94983: checking to see if all hosts have failed and the running result is not ok 22225 1726882785.94984: done checking to see if all hosts have failed 22225 1726882785.94984: getting the remaining hosts for this loop 22225 1726882785.94986: done getting the remaining hosts for this loop 22225 1726882785.94989: getting the next task for host managed_node1 22225 1726882785.94993: done getting next task for host managed_node1 22225 1726882785.94996: ^ task is: TASK: Check routes and DNS 22225 1726882785.94999: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882785.95001: getting variables 22225 1726882785.95002: in VariableManager get_vars() 22225 1726882785.95085: Calling all_inventory to load vars for managed_node1 22225 1726882785.95088: Calling groups_inventory to load vars for managed_node1 22225 1726882785.95091: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882785.95125: Calling all_plugins_play to load vars for managed_node1 22225 1726882785.95130: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882785.95134: Calling groups_plugins_play to load vars for managed_node1 22225 1726882785.97460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882785.99692: done with get_vars() 22225 1726882785.99728: done getting variables 22225 1726882785.99782: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:39:45 -0400 (0:00:00.188) 0:00:41.390 ****** 22225 1726882785.99817: entering _queue_task() for managed_node1/shell 22225 1726882786.00424: worker is 1 (out of 1 available) 22225 1726882786.00437: exiting _queue_task() for managed_node1/shell 22225 1726882786.00448: done queuing things up, now waiting for results queue to drain 22225 1726882786.00449: waiting for pending results... 22225 1726882786.00579: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 22225 1726882786.00691: in run() - task 0affc7ec-ae25-ec05-55b7-00000000075e 22225 1726882786.00709: variable 'ansible_search_path' from source: unknown 22225 1726882786.00713: variable 'ansible_search_path' from source: unknown 22225 1726882786.00756: calling self._execute() 22225 1726882786.00966: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882786.00970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882786.00989: variable 'omit' from source: magic vars 22225 1726882786.01748: variable 'ansible_distribution_major_version' from source: facts 22225 1726882786.01787: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882786.01813: variable 'omit' from source: magic vars 22225 1726882786.01867: variable 'omit' from source: magic vars 22225 1726882786.01948: variable 'omit' from source: magic vars 22225 1726882786.01999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22225 1726882786.02099: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22225 1726882786.02140: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22225 1726882786.02175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882786.02206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22225 1726882786.02281: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
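The include at tests_ipv6.yml:113 pulls in tasks/check_network_dns.yml, and its first task, "Check routes and DNS", is a shell action (the shell action plugin is built on top of the command module, which is why both action modules get loaded for it). The commands it runs are not visible in this excerpt; a hypothetical sketch of the two pieces:

    # tests_ipv6.yml:113 -- sketch; the include target path is taken from the log
    - name: Verify network state restored to default
      include_tasks: tasks/check_network_dns.yml

    # tasks/check_network_dns.yml:6 -- hypothetical commands; the log only shows
    # that a shell task named "Check routes and DNS" starts at this line
    - name: Check routes and DNS
      shell: |
        ip route
        ip -6 route
        cat /etc/resolv.conf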
22225 1726882786.02289: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882786.02293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882786.02424: Set connection var ansible_connection to ssh 22225 1726882786.02440: Set connection var ansible_pipelining to False 22225 1726882786.02486: Set connection var ansible_shell_executable to /bin/sh 22225 1726882786.02489: Set connection var ansible_timeout to 10 22225 1726882786.02494: Set connection var ansible_shell_type to sh 22225 1726882786.02496: Set connection var ansible_module_compression to ZIP_DEFLATED 22225 1726882786.02508: variable 'ansible_shell_executable' from source: unknown 22225 1726882786.02510: variable 'ansible_connection' from source: unknown 22225 1726882786.02514: variable 'ansible_module_compression' from source: unknown 22225 1726882786.02516: variable 'ansible_shell_type' from source: unknown 22225 1726882786.02519: variable 'ansible_shell_executable' from source: unknown 22225 1726882786.02526: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882786.02531: variable 'ansible_pipelining' from source: unknown 22225 1726882786.02647: variable 'ansible_timeout' from source: unknown 22225 1726882786.02651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882786.02776: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882786.02792: variable 'omit' from source: magic vars 22225 1726882786.02799: starting attempt loop 22225 1726882786.02802: running the handler 22225 1726882786.02813: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22225 1726882786.02851: _low_level_execute_command(): starting 22225 1726882786.02859: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22225 1726882786.03966: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882786.04006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882786.04091: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882786.05882: stdout chunk (state=3): >>>/root <<< 22225 1726882786.06080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882786.06100: stderr chunk (state=3): >>><<< 22225 1726882786.06110: stdout chunk (state=3): >>><<< 22225 1726882786.06146: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882786.06167: _low_level_execute_command(): starting 22225 1726882786.06200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066 `" && echo ansible-tmp-1726882786.061534-23699-123367041165066="` echo /root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066 `" ) && sleep 0' 22225 1726882786.07134: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882786.07276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882786.07369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882786.07420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882786.07524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882786.09498: stdout chunk (state=3): 
>>>ansible-tmp-1726882786.061534-23699-123367041165066=/root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066 <<< 22225 1726882786.09757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882786.09761: stdout chunk (state=3): >>><<< 22225 1726882786.09764: stderr chunk (state=3): >>><<< 22225 1726882786.09943: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882786.061534-23699-123367041165066=/root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882786.09947: variable 'ansible_module_compression' from source: unknown 22225 1726882786.09999: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-2222502xecypi/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22225 1726882786.10056: variable 'ansible_facts' from source: unknown 22225 1726882786.10179: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/AnsiballZ_command.py 22225 1726882786.10390: Sending initial data 22225 1726882786.10400: Sent initial data (155 bytes) 22225 1726882786.11694: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882786.11720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882786.11945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882786.12013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 
1726882786.12183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882786.13898: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22225 1726882786.13951: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22225 1726882786.14369: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-2222502xecypi/tmpmwfzj2c0 /root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/AnsiballZ_command.py <<< 22225 1726882786.14373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/AnsiballZ_command.py" <<< 22225 1726882786.14376: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpmwfzj2c0" to remote "/root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/AnsiballZ_command.py" <<< 22225 1726882786.15671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882786.15826: stderr chunk (state=3): >>><<< 22225 1726882786.15840: stdout chunk (state=3): >>><<< 22225 1726882786.15866: done transferring module to remote 22225 1726882786.15882: _low_level_execute_command(): starting 22225 1726882786.15890: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/ /root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/AnsiballZ_command.py && sleep 0' 22225 1726882786.16681: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882786.16704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882786.16720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882786.16820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
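At this point the payload has been uploaded and the connection plugin is about to make it executable and run it. The surrounding entries show Ansible's standard remote execution flow for this task: create a per-task temporary directory under umask 77, upload the AnsiballZ_command.py payload over the multiplexed SSH connection with an sftp "put", chmod it, execute it with the remote /usr/bin/python3.12, then remove the directory. The following stand-alone sketch replays the same five steps with plain ssh/sftp; the host, ControlMaster socket, and paths are copied from the log above, but the wrapper itself is only an illustration and is not Ansible's own code (the real mkdir command also echoes the directory name back, which is omitted here).

import subprocess

HOST = "root@10.31.15.7"
CTL = "-oControlPath=/root/.ansible/cp/d9f6ac3c31"   # mux socket seen in the debug1 lines
TMP = "/root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066"
MODULE = TMP + "/AnsiballZ_command.py"
LOCAL = "/root/.ansible/tmp/ansible-local-2222502xecypi/tmpmwfzj2c0"  # local payload named in the sftp "put"

def ssh(command):
    # One remote /bin/sh invocation, mirroring what _low_level_execute_command() logs.
    return subprocess.run(
        ["ssh", CTL, HOST, "/bin/sh -c '%s && sleep 0'" % command],
        capture_output=True, text=True, check=True)

ssh('( umask 77 && mkdir -p "%s" )' % TMP)            # 1. per-task temp dir (simplified)
subprocess.run(["sftp", CTL, "-b", "-", HOST],        # 2. upload the AnsiballZ payload
               input="put %s %s\n" % (LOCAL, MODULE), text=True, check=True)
ssh("chmod u+x %s/ %s" % (TMP, MODULE))               # 3. mark dir and module executable
result = ssh("/usr/bin/python3.12 %s" % MODULE)       # 4. run the module
print(result.stdout)                                  #    it prints one JSON result document
ssh("rm -f -r %s/ > /dev/null 2>&1" % TMP)            # 5. clean up the temp dir

The only output the controller cares about is the single JSON document the module prints on stdout in step 4, which is what the later "_low_level_execute_command() done" entry echoes back as the task result.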
22225 1726882786.16843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' <<< 22225 1726882786.16859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882786.16879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882786.16963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882786.18842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882786.18939: stderr chunk (state=3): >>><<< 22225 1726882786.18950: stdout chunk (state=3): >>><<< 22225 1726882786.18976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882786.18985: _low_level_execute_command(): starting 22225 1726882786.18995: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/AnsiballZ_command.py && sleep 0' 22225 1726882786.19684: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882786.19712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882786.19828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882786.19869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882786.19927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 
1726882786.37327: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c5:8e:44:af brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.7/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2885sec preferred_lft 2885sec\n inet6 fe80::8ff:c5ff:fe8e:44af/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.7 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.7 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:39:46.361949", "end": "2024-09-20 21:39:46.371333", "delta": "0:00:00.009384", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22225 1726882786.39512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 
<<< 22225 1726882786.39516: stderr chunk (state=3): >>><<< 22225 1726882786.39519: stdout chunk (state=3): >>><<< 22225 1726882786.39526: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c5:8e:44:af brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.7/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 2885sec preferred_lft 2885sec\n inet6 fe80::8ff:c5ff:fe8e:44af/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.7 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.7 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:39:46.361949", "end": "2024-09-20 21:39:46.371333", "delta": "0:00:00.009384", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.7 closed. 22225 1726882786.39534: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22225 1726882786.39536: _low_level_execute_command(): starting 22225 1726882786.39539: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882786.061534-23699-123367041165066/ > /dev/null 2>&1 && sleep 0' 22225 1726882786.40274: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22225 1726882786.40282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882786.40285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882786.40290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882786.40446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882786.40450: stderr chunk (state=3): >>>debug2: match not found <<< 22225 1726882786.40452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882786.40454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22225 1726882786.40456: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.7 is address <<< 22225 1726882786.40458: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22225 1726882786.40460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22225 1726882786.40462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22225 1726882786.40464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22225 1726882786.40466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 <<< 22225 1726882786.40468: stderr chunk (state=3): >>>debug2: match found <<< 22225 1726882786.40470: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22225 1726882786.40515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK <<< 22225 1726882786.41042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22225 1726882786.41239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22225 1726882786.43082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22225 1726882786.43086: stdout chunk (state=3): >>><<< 22225 1726882786.43091: stderr chunk (state=3): >>><<< 22225 1726882786.43110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.7 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.7 originally 10.31.15.7 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/d9f6ac3c31' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22225 1726882786.43117: handler run complete 22225 1726882786.43149: Evaluated conditional (False): False 22225 1726882786.43159: attempt loop complete, returning result 22225 1726882786.43162: _execute() done 22225 1726882786.43165: dumping result to json 22225 1726882786.43172: done dumping result, returning 22225 1726882786.43185: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0affc7ec-ae25-ec05-55b7-00000000075e] 22225 1726882786.43187: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000075e 22225 1726882786.43335: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000075e 22225 1726882786.43339: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009384", "end": "2024-09-20 21:39:46.371333", "rc": 0, "start": "2024-09-20 21:39:46.361949" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:c5:8e:44:af brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.7/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 2885sec preferred_lft 2885sec inet6 fe80::8ff:c5ff:fe8e:44af/64 
scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.7 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.7 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 22225 1726882786.43429: no more pending results, returning what we have 22225 1726882786.43433: results queue empty 22225 1726882786.43435: checking for any_errors_fatal 22225 1726882786.43437: done checking for any_errors_fatal 22225 1726882786.43443: checking for max_fail_percentage 22225 1726882786.43446: done checking for max_fail_percentage 22225 1726882786.43447: checking to see if all hosts have failed and the running result is not ok 22225 1726882786.43448: done checking to see if all hosts have failed 22225 1726882786.43449: getting the remaining hosts for this loop 22225 1726882786.43451: done getting the remaining hosts for this loop 22225 1726882786.43456: getting the next task for host managed_node1 22225 1726882786.43465: done getting next task for host managed_node1 22225 1726882786.43468: ^ task is: TASK: Verify DNS and network connectivity 22225 1726882786.43472: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 22225 1726882786.43476: getting variables 22225 1726882786.43478: in VariableManager get_vars() 22225 1726882786.43743: Calling all_inventory to load vars for managed_node1 22225 1726882786.43746: Calling groups_inventory to load vars for managed_node1 22225 1726882786.43749: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882786.43761: Calling all_plugins_play to load vars for managed_node1 22225 1726882786.43764: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882786.43768: Calling groups_plugins_play to load vars for managed_node1 22225 1726882786.46256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882786.48483: done with get_vars() 22225 1726882786.48508: done getting variables 22225 1726882786.48569: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:39:46 -0400 (0:00:00.487) 0:00:41.878 ****** 22225 1726882786.48600: entering _queue_task() for managed_node1/shell 22225 1726882786.48955: worker is 1 (out of 1 available) 22225 1726882786.48969: exiting _queue_task() for managed_node1/shell 22225 1726882786.48981: done queuing things up, now waiting for results queue to drain 22225 1726882786.48982: waiting for pending results... 22225 1726882786.49448: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 22225 1726882786.49459: in run() - task 0affc7ec-ae25-ec05-55b7-00000000075f 22225 1726882786.49464: variable 'ansible_search_path' from source: unknown 22225 1726882786.49467: variable 'ansible_search_path' from source: unknown 22225 1726882786.49470: calling self._execute() 22225 1726882786.49570: variable 'ansible_host' from source: host vars for 'managed_node1' 22225 1726882786.49578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 22225 1726882786.49652: variable 'omit' from source: magic vars 22225 1726882786.50008: variable 'ansible_distribution_major_version' from source: facts 22225 1726882786.50020: Evaluated conditional (ansible_distribution_major_version != '6'): True 22225 1726882786.50173: variable 'ansible_facts' from source: unknown 22225 1726882786.51129: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 22225 1726882786.51133: when evaluation is False, skipping this task 22225 1726882786.51136: _execute() done 22225 1726882786.51138: dumping result to json 22225 1726882786.51141: done dumping result, returning 22225 1726882786.51143: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0affc7ec-ae25-ec05-55b7-00000000075f] 22225 1726882786.51145: sending task result for task 0affc7ec-ae25-ec05-55b7-00000000075f 22225 1726882786.51275: done sending task result for task 0affc7ec-ae25-ec05-55b7-00000000075f 22225 1726882786.51395: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 22225 1726882786.51538: no more 
pending results, returning what we have 22225 1726882786.51542: results queue empty 22225 1726882786.51543: checking for any_errors_fatal 22225 1726882786.51554: done checking for any_errors_fatal 22225 1726882786.51554: checking for max_fail_percentage 22225 1726882786.51556: done checking for max_fail_percentage 22225 1726882786.51557: checking to see if all hosts have failed and the running result is not ok 22225 1726882786.51558: done checking to see if all hosts have failed 22225 1726882786.51558: getting the remaining hosts for this loop 22225 1726882786.51560: done getting the remaining hosts for this loop 22225 1726882786.51564: getting the next task for host managed_node1 22225 1726882786.51571: done getting next task for host managed_node1 22225 1726882786.51573: ^ task is: TASK: meta (flush_handlers) 22225 1726882786.51575: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882786.51579: getting variables 22225 1726882786.51580: in VariableManager get_vars() 22225 1726882786.51620: Calling all_inventory to load vars for managed_node1 22225 1726882786.51625: Calling groups_inventory to load vars for managed_node1 22225 1726882786.51628: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882786.51639: Calling all_plugins_play to load vars for managed_node1 22225 1726882786.51643: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882786.51646: Calling groups_plugins_play to load vars for managed_node1 22225 1726882786.52842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882786.54047: done with get_vars() 22225 1726882786.54078: done getting variables 22225 1726882786.54156: in VariableManager get_vars() 22225 1726882786.54172: Calling all_inventory to load vars for managed_node1 22225 1726882786.54174: Calling groups_inventory to load vars for managed_node1 22225 1726882786.54176: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882786.54181: Calling all_plugins_play to load vars for managed_node1 22225 1726882786.54184: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882786.54187: Calling groups_plugins_play to load vars for managed_node1 22225 1726882786.55586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882786.56766: done with get_vars() 22225 1726882786.56794: done queuing things up, now waiting for results queue to drain 22225 1726882786.56796: results queue empty 22225 1726882786.56797: checking for any_errors_fatal 22225 1726882786.56799: done checking for any_errors_fatal 22225 1726882786.56799: checking for max_fail_percentage 22225 1726882786.56800: done checking for max_fail_percentage 22225 1726882786.56800: checking to see if all hosts have failed and the running result is not ok 22225 1726882786.56801: done checking to see if all hosts have failed 22225 1726882786.56801: getting the remaining hosts for this loop 22225 1726882786.56802: done getting the remaining hosts for this loop 22225 1726882786.56804: getting the next task for host managed_node1 22225 1726882786.56807: done getting next task for host managed_node1 22225 1726882786.56808: ^ task is: TASK: meta (flush_handlers) 22225 
1726882786.56809: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22225 1726882786.56811: getting variables 22225 1726882786.56812: in VariableManager get_vars() 22225 1726882786.56824: Calling all_inventory to load vars for managed_node1 22225 1726882786.56826: Calling groups_inventory to load vars for managed_node1 22225 1726882786.56828: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882786.56832: Calling all_plugins_play to load vars for managed_node1 22225 1726882786.56835: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882786.56836: Calling groups_plugins_play to load vars for managed_node1 22225 1726882786.57929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882786.60007: done with get_vars() 22225 1726882786.60039: done getting variables 22225 1726882786.60096: in VariableManager get_vars() 22225 1726882786.60113: Calling all_inventory to load vars for managed_node1 22225 1726882786.60115: Calling groups_inventory to load vars for managed_node1 22225 1726882786.60118: Calling all_plugins_inventory to load vars for managed_node1 22225 1726882786.60125: Calling all_plugins_play to load vars for managed_node1 22225 1726882786.60128: Calling groups_plugins_inventory to load vars for managed_node1 22225 1726882786.60131: Calling groups_plugins_play to load vars for managed_node1 22225 1726882786.61599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22225 1726882786.64051: done with get_vars() 22225 1726882786.64092: done queuing things up, now waiting for results queue to drain 22225 1726882786.64094: results queue empty 22225 1726882786.64095: checking for any_errors_fatal 22225 1726882786.64097: done checking for any_errors_fatal 22225 1726882786.64098: checking for max_fail_percentage 22225 1726882786.64099: done checking for max_fail_percentage 22225 1726882786.64100: checking to see if all hosts have failed and the running result is not ok 22225 1726882786.64100: done checking to see if all hosts have failed 22225 1726882786.64101: getting the remaining hosts for this loop 22225 1726882786.64103: done getting the remaining hosts for this loop 22225 1726882786.64112: getting the next task for host managed_node1 22225 1726882786.64116: done getting next task for host managed_node1 22225 1726882786.64117: ^ task is: None 22225 1726882786.64118: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22225 1726882786.64119: done queuing things up, now waiting for results queue to drain 22225 1726882786.64120: results queue empty 22225 1726882786.64121: checking for any_errors_fatal 22225 1726882786.64140: done checking for any_errors_fatal 22225 1726882786.64140: checking for max_fail_percentage 22225 1726882786.64142: done checking for max_fail_percentage 22225 1726882786.64143: checking to see if all hosts have failed and the running result is not ok 22225 1726882786.64143: done checking to see if all hosts have failed 22225 1726882786.64146: getting the next task for host managed_node1 22225 1726882786.64149: done getting next task for host managed_node1 22225 1726882786.64150: ^ task is: None 22225 1726882786.64151: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1 : ok=75 changed=2 unreachable=0 failed=0 skipped=63 rescued=0 ignored=0

Friday 20 September 2024 21:39:46 -0400 (0:00:00.157) 0:00:42.036 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 3.63s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
fedora.linux_system_roles.network : Configure networking connection profiles --- 2.91s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which services are running ---- 2.58s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.45s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gather the minimum subset of ansible_facts required by the network role test --- 2.12s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Gathering Facts --------------------------------------------------------- 2.08s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3
Install iproute --------------------------------------------------------- 1.57s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 1.52s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install iproute --------------------------------------------------------- 1.37s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Ensure ping6 command is present ----------------------------------------- 1.35s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.29s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Create veth interface veth0 --------------------------------------------- 1.26s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.05s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.91s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.70s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather current interface info ------------------------------------------- 0.54s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Get NM profile info ----------------------------------------------------- 0.51s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Stat profile file ------------------------------------------------------- 0.50s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Check if system is ostree ----------------------------------------------- 0.50s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Test gateway can be pinged ---------------------------------------------- 0.50s
/tmp/collections-AQL/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86
22225 1726882786.64427: RUNNING CLEANUP
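For reference, the shell snippet that the "Check routes and DNS" task executed earlier in this log is copied verbatim below from the logged module arguments, so it can be re-run by hand when chasing the same class of connectivity problem. The small Python wrapper is just one convenient way to invoke it; it assumes a /bin/sh that understands "pipefail", as the managed node's does, and it is not part of the test playbook itself.

import subprocess

# Verbatim script from the ansible.legacy.command invocation logged above.
CHECK_ROUTES_AND_DNS = """set -euo pipefail
echo IP
ip a
echo IP ROUTE
ip route
echo IP -6 ROUTE
ip -6 route
echo RESOLV
if [ -f /etc/resolv.conf ]; then
 cat /etc/resolv.conf
else
 echo NO /etc/resolv.conf
 ls -alrtF /etc/resolv.* || :
fi
"""

done = subprocess.run(["/bin/sh", "-c", CHECK_ROUTES_AND_DNS],
                      capture_output=True, text=True)
print(done.stdout)              # IP / IP ROUTE / IP -6 ROUTE / RESOLV sections, as in the task output
print("rc =", done.returncode)

In the run recorded here the script returned rc=0 in about 0.009s, and the follow-on "Verify DNS and network connectivity" task was skipped because its distribution conditional evaluated to False.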